Dell - ESUPPORT

Role: Senior UX Designer
Responsibilities: Research (author, analyst, documenter) / Design (UX, UI)
Software: Figma / Miro / usertesting.com
Timeline: 6 sprints (3 months)

As a Senior UX Designer at Dell, one of the areas I focused on within the eSupport team was revenue generated on the platform. The hardware and software diagnostic services were identified as flows that could be improved for both the internal business and eSupport customers. The redesign brought together a more streamlined, simplified process, a focused UI, and refined content (all revised through multiple rounds of research and user testing). This combination resulted in a much-improved customer experience and a dramatic increase in the desired business metrics outlined below:

  • Customers who added a service to their carts jumped from 474 to 989 in Week 1

  • Revenue increased from $24K to $61K in Week 1, $71K in Week 2, and $89K in Week 3

  • Projected to earn an additional $2M in annual revenue
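The numbers above are consistent with annualizing the Week 1 lift. The exact projection methodology isn't covered here; the snippet below is an illustrative sanity check under that assumption, not the actual model used:

```python
# Illustrative sanity check of the ~$2M annual projection.
# The weekly figures come from the case study; annualizing the
# Week 1 lift is an assumption, not the team's actual model.
baseline_weekly = 24_000   # pre-redesign weekly revenue
week1_revenue = 61_000     # post-redesign Week 1 revenue

weekly_lift = week1_revenue - baseline_weekly  # $37K incremental per week
annual_lift = weekly_lift * 52                 # annualized lift

print(f"Projected annual lift: ${annual_lift:,}")  # ≈ $1.9M, in line with ~$2M
```

Later weeks ($71K, $89K) trended higher, so annualizing Week 1 alone makes the $2M figure a conservative floor.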


A more in-depth breakdown of the project can be found below. Due to NDAs and proprietary information, images and certain details related to this project cannot be shared.


DEFINE

This project stemmed from business evaluations of current offerings for customers across multiple channels/verticals and identifying which areas could be improved. In an initial stakeholder meeting, more in-depth details and data within eSupport were reviewed, highlighting that customers seeking help with out-of-warranty devices could locate where to initiate the diagnostic service request, but few were actually purchasing the service.

This insight indicated the demand for these services was there, but something was preventing customers from purchasing the offer. A few reasons were considered:

  • Customers decided the price was more than they wanted to pay for a diagnostic service

  • It wasn’t clear what service was being offered, and customers were unsure what they were purchasing or what they should purchase for their respective issue

  • The process of signing up for the service involved too many steps and too much information, making it burdensome for the customer

As these potential reasons were only assumptions based on customer behavior, we began preparing a usability test. This initial research involved observing users interacting with the existing hardware and software diagnostic service flows. Guided by simple test scripts, participants soon surfaced the pain points, expressing confusion and frustration around finding the offers on the page, determining which service was appropriate for their given test scenario, and understanding how and when the service would be completed.

Clearly there were gaps in both the UI (how and where offers were presented on the page) and the content, which did not clearly describe the details of the available services. Based on that feedback, we began exploring how the design and content could be improved to alleviate those issues.


DISTILL

With the feedback gathered from usability testing, the content strategist and I began re-evaluating both the overall design of the page and the content being presented.

From the design perspective, testing showed that participants had a hard time identifying the offers on the page in the first place. The eSupport flow used a single-page progressive disclosure pattern, building on itself to show relevant options as the user made selections. Because of this, certain design components, like the cards detailing offers, could appear partially below the fold where the CTAs were shown. Beyond that, the cards all had the same visual weight and appearance; nothing distinguished one from another, causing them to blend together. This made it difficult for users not only to identify the cards when they loaded onto the page, but also to determine which service was best for their issue.

On the content side, there were also multiple points to address. First, the dropdown presenting the list of issues wasn’t intuitive, especially for customers who may not be tech-savvy enough to determine or articulate exactly what issue they were experiencing. Beyond that, the content within the cards describing the services didn’t highlight the benefits or what differentiated one service from another to help customers make an informed choice.

Moving beyond the page design, once a customer selected one of the offers by clicking its respective CTA, a modal flow opened that further outlined the service offerings and additional selections depending on the type of service. In the existing experience, this was an additional 4-5 modal steps of information. Initial feedback indicated that reframing this content and design into a more direct, consolidated approach would simplify the process for the customer and increase the likelihood of them purchasing the service.


DESIGN

Now that the areas of improvement had been clearly identified, the content strategist and I worked in coordination to begin redesigning the experience.

To address customers overlooking the offers, we implemented a ‘recommended’ offer, displayed as a tag at the top of the card, and outlined the card in a matching green stroke. This helped draw attention to the offers, distinguished one from another, and signaled to customers which option would likely be the appropriate selection based on their selected issue.

Based on the amount of content originally displayed within the cards, we explored an expanding/collapsing interaction that would show an initial summary, with more details revealed once the CTA was selected. We ultimately decided against this solution for a few reasons. First, that interaction model required an additional click to complete the process. While not a deal-breaker, in an experience we were trying to streamline, removing clicks anywhere in the process helped us better achieve that goal. Second, A/B testing revealed the expanding/collapsing cards didn’t perform as well as the alternate designs we explored. We discovered that an approach combining more focused content, giving just the necessary details at that step, with a shortened card height that displayed everything above the fold was the best received by test participants.

The last primary focus was reducing the number of steps in the modal flow required to add an offer to the cart. We achieved this by reevaluating the information architecture to determine what was and was not needed. The content strategist also refined how the information was presented, making it more descriptive yet concise, and more conversational so it would feel more familiar to customers. We reduced the hardware service flow from 5 modals to 2 before the offer was added to the cart, and similarly reduced the software service flow from 4 modals to 2. This greatly reduced the cognitive load on customers while still providing enough information for them to confidently purchase the service.

As referenced previously, usability testing was performed throughout the design process as we worked through various approaches. As research tends to do, certain findings challenged our assumptions about what would perform best. This feedback influenced multiple design decisions, including the colors and placement of buttons and other graphic elements, the balance of how much content was presented and when in the process it appeared, and the best labels for services to make them easy to recognize and understand at a glance.


DELIVER

Once the design and content were positively reviewed by test participants and approved internally by various stakeholders, I prepared the project to be handed off to development. Members of the development team had been involved from the start of the project to evaluate functionality and add their insight regarding any limitations or improvements that could impact the experience.

The finalized design files were labeled as such and provided in a separate location from the working files to ensure there was no confusion around which files developers should be referencing. A redline document was also provided to detail the specs of various design elements including type size and color where it deviated from the design system, padding/spacing between components, and any notes regarding interactions. QA was also performed once development completed their production build to ensure the final experience matched what was designed.

As outlined in the summary above, after the new design was implemented there was an immediate increase in the number of customers adding these services to their cart as well as overall revenue from the purchased offers. Projections indicated it should result in a $2 million increase in annual revenue.

By the end of the project, not only did the bottom line benefit, but, more importantly, the customer experience improved: customers could clearly understand the services being offered and get the help they needed to repair their devices through a more relatable, efficient process.