Skills Study
Type of work: User Research, Content & Communication Design, Lo-Fidelity Wireframing, Interaction Design
Date: February 2017–June 2018
Time commitment: ~80%
Brief
The People Development team in ONE Design works to create an inclusive environment where learning is persistent, development is supported, and contributions are recognized. Upon joining the team, one of the largest gaps exposed by our rapid scaling was a lack of understanding of our makeup of skills and expertise. I was tasked with leveraging the full end-to-end strategic design process to fill this gap and create an entirely new service.
The Challenge
At the time, design leaders didn’t have the information they needed to approach partners for resourcing or strategic planning. Managers didn’t have insight into their associates’ perceived abilities, which they needed to guide development journeys appropriately. Associates didn’t have a common language for talking with their managers about their skills, where they wanted to grow, and what they were interested in. And our learning experiences team didn’t have data on what skills designers thought they would need as the practice evolved. The challenge was to create a mechanism for capturing data on skill sets and to visualize it in ways that our multiple stakeholders would find useful.
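To make the problem concrete, here is a minimal sketch of the kind of record such a mechanism might capture. The shape and field names here are hypothetical illustrations, not the schema we actually shipped:

```typescript
// Hypothetical shape of a single Skills Study response; the real
// schema was internal to ONE Design and may have differed.
interface SkillRating {
  skill: string;                  // e.g. "Interaction Design"
  discipline: string;             // discipline the skill rolls up under
  proficiency: 1 | 2 | 3 | 4 | 5; // self-assessed level
  wantsToGrow: boolean;           // flagged interest in developing this skill
}

interface SkillsStudyResponse {
  associateId: string;
  lobTeam: string;              // line-of-business team
  shareWithColleagues: boolean; // whether to share results with colleagues
  completedAt: Date;
  ratings: SkillRating[];
}
```

Each stakeholder group cared about a different slice of this data: leaders about aggregates across teams, managers about individual abilities and growth interests, and the learning experiences team about where the practice needed to develop.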
Stakeholder Mapping
Before diving into creating a system for collecting data, we started by understanding the different audiences that would consume the information and what mattered most to them. We quickly realized that collecting the data was only half the battle and that data visualization was equally important, if not more so. We also recognized that we needed a simple, quick way to communicate this effort, as we’d be engaging with multiple stakeholders over the life-cycle of the initiative.
Research, Identifying Allies & Content Design
Fortunately for us, the design community at large is great at sharing observations and perspectives on the internet. It was easy to find interpretations and definitions of skills, so all we had to do was evaluate them against how they manifested within ONE Design. To do this, we partnered with allies across the org who were well known for their expertise to define our design disciplines and the skills that rolled up underneath them.
Content consolidated from multiple people reads as exactly that: written by many minds. One of the longest and most tedious exercises was massaging the content so that it was clear, concise, and representative of the breadth and depth of expertise. Once it was refined, we returned to our allies to verify that the core messaging had not been lost.
Resourcing & Product Requirements
As mentioned before, we realized early on that this initiative was larger than originally scoped and resourced. While the skill sets on the current team were suited to research and content consolidation, we lacked a front-end developer and a data visualization designer to bring it to life. Through our networks, two people raised their hands for the challenge, and we then abstracted an MVP from the desired final outcome to inform feature build-out.
Lo-Fidelity Wireframing
I’m a visual processor and find that I gain the most alignment when I communicate my ideas in visual form. Working from the MVP and feature requirements, I created lo-fidelity interaction and wireframe designs to show the flow of data collection and how it translated into visual feedback.
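As one concrete example of that collection-to-feedback flow, the sketch below shows roughly what the wireframes implied: individual self-ratings rolled up into per-skill averages that a chart component could render. It assumes the hypothetical SkillsStudyResponse shape sketched above and is illustrative only:

```typescript
// Illustrative only: roll individual self-ratings up into per-skill
// averages that a visualization (e.g. a bar or radar view) could render.
function averageBySkill(responses: SkillsStudyResponse[]): Map<string, number> {
  const totals = new Map<string, { sum: number; count: number }>();
  for (const response of responses) {
    for (const rating of response.ratings) {
      const entry = totals.get(rating.skill) ?? { sum: 0, count: 0 };
      entry.sum += rating.proficiency;
      entry.count += 1;
      totals.set(rating.skill, entry);
    }
  }
  const averages = new Map<string, number>();
  for (const [skill, { sum, count }] of totals) {
    averages.set(skill, sum / count);
  }
  return averages;
}
```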
Pilot Testing
Running in parallel with the product build-out was a pilot to test the content on real designers and their managers. We tapped a single LOB team of 40 people to take the Skills Study, asking them to provide feedback on the experience and the perceived usefulness of the data. This exercise was beneficial for a number of reasons. It flagged the need to reorganize the skill and discipline mapping in a way that was more intuitive for users. We learned that some users feel strongly about privacy and want a choice about whether their data is shared with colleagues. And lastly, we timed how long the assessment took so we could tell participants in later iterations what to expect and how much time to set aside.
Hi-Fidelity Refinement
Our development and visualization resources were amazing at bringing our vision to reality in a way consistent with branding standards. The final product was intuitive, interactive, and kept privacy preferences in mind.
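On the privacy point: one simple way a product like this can honor sharing preferences (again a hypothetical sketch, not our shipped logic) is to gate colleague-facing views on an explicit opt-in while aggregate views draw on the anonymized whole:

```typescript
// Hypothetical sketch: individual records appear in colleague-facing
// views only when the associate opted in; aggregate views such as
// averageBySkill above can use the full population, since no
// individual is identifiable there.
function visibleToColleagues(responses: SkillsStudyResponse[]): SkillsStudyResponse[] {
  return responses.filter((response) => response.shareWithColleagues);
}
```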
Roll-out & Implementation
Support for this initiative was paramount throughout the product life-cycle. We met consistently with allies and other stakeholders to keep them abreast of updates and timelines. Roll-out was phased: first to a limited number of LOB teams to confirm site reliability, then more broadly to everyone. To ensure adoption, we knew Skills Study couldn’t live separate from existing Capital One and ONE Design processes, so we worked closely with our HR partners to eliminate redundancies and build integrations where necessary. We also created wrap-around support and resources to address any questions or concerns that might arise.
Outcomes
Almost 80% of all designers have taken the Skills Study and used it to guide development conversations with their managers. Our design people managers are better equipped to match work to current ability and growth paths. Leaders have a consistent way to talk about skills across the team and are able to make informed hiring and project allocation decisions. Our HR partners more clearly understand our design disciplines and skills, which they now leverage in job descriptions, performance frameworks, and talent management systems.
We also continue to look for opportunities to use skill affinities as a mechanism for conversation and mentorship. One way we did this was at our yearly All Hands: upon check-in, individuals received a visual representation of their skills makeup and were asked to outwardly self-identify as an “expert” or “want to learn” with color-coded stickers.
Lastly, we recently submitted an application to patent our product and method of visualizing data.
Lessons Learned
Supporting evidence for collecting and visualizing skill data was strong for leaders, managers, and supporting org teams. It was not as strong for the individual designer or user. Regular, consistent adoption has been slow because the value back to individuals wasn’t immediate, directly facilitated, or contextually relevant. We wish we had performed additional empathy interviews to understand more deeply what mattered to them and what use cases they had for this data.
Pressed for time, we didn’t fully think through the longevity of the product. And in reality, this product is a service. The team and I are still figuring out when to re-calibrate the population, how to account for new hires and org changes, and how all of that manifests within the tool.