
Using Data Analysis to Transform the Customer Experience of Hiring at a Large Federal Organization

Imagine stepping in to lead an organization, only to find unhappy customers and a pain-point-filled customer experience on one side, and senior management mandating major changes to how you do business on the other. The pressure is intense, and it is difficult to know which fire to fight first.

This is a real-life situation a recent client found themselves in. With The Clearing’s help, the right plan, and smart use of data, our client was able to not only put out both fires but come out ahead on both fronts. Here’s how.

The Situation

The organization in question is a Federal agency's recruitment and onboarding team (consisting of Human Resources and Security) that provides these services to multiple Department of Defense (DoD) and other Federal customers. Specifically, it helps agencies identify and employ the best-fit candidates.

However, its customers were frustrated with the organization's hiring process – particularly how long it took – and some were selecting other providers to meet their hiring needs. The organization also faced a Federal mandate to reduce its time-to-hire from a historical average of over 90 days to under 70 days within one year, and to 45 days within five years.

To remain a key shared service provider and trusted partner in this space, the organization sought support in reviewing its current business operations: identifying pain points in the system and determining how best to improve the customer experience – ideally repairing customer trust along the way. In addition, the organization suspected that internal obstacles were slowing the process.

The Plan

We started by identifying available data and using it to find and prioritize the issues customers reported. This analysis surfaced three primary pain points:

Timeline: The length of time it took to move an applicant through the hiring pipeline
Continuity: Inconsistent hiring practices across customers
Engagement: Limited customer engagement throughout the hiring process

To better understand these pain points, we applied Performance Data Analysis to the hiring workflow. This analysis revealed critical insights into the process, including:

The mean, median, mode, and standard deviation of the time spent at each step in the hiring process. Some steps naturally take longer than others; data analysis allowed us to identify the outliers and zero in on the issues (a minimal sketch follows this list).
Which steps presented the biggest challenges.
The prevalence, and subsequent effects, of errors at each step in the process. These errors are often what slow the process down.
How the timelines were affected by unnecessary steps and by deviations from standard operating procedures.
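To make this concrete, here is a minimal sketch in Python (3.8+) of the kind of per-step timing summary such an analysis produces. The step names, durations, and the outlier rule are invented for illustration; the client's actual data and thresholds differed.

```python
import statistics
from collections import defaultdict

# Hypothetical records: (hiring step, days an action spent at that step).
records = [
    ("post_announcement", 12), ("post_announcement", 14), ("post_announcement", 13),
    ("qualify_applicants", 20), ("qualify_applicants", 22), ("qualify_applicants", 55),
    ("security_review", 30), ("security_review", 31), ("security_review", 29),
]

by_step = defaultdict(list)
for step, days in records:
    by_step[step].append(days)

for step, days in sorted(by_step.items()):
    mean = statistics.mean(days)
    stdev = statistics.stdev(days) if len(days) > 1 else 0.0
    print(f"{step}: mean={mean:.1f} median={statistics.median(days)} "
          f"mode={statistics.mode(days)} stdev={stdev:.1f}")
    # A simple rule of thumb: flag any action that took more than twice
    # the step's median -- these outliers are where to look for root causes.
    outliers = [d for d in days if d > 2 * statistics.median(days)]
    if outliers:
        print(f"  outliers worth investigating: {outliers}")
```

A summary like this makes it obvious which steps have high variance relative to their mean, and which individual actions deserve a closer look.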

Further analysis diagnosed which aspects of the hiring process needed the most attention. It also illuminated how much benefit the organization and its customers would realize by addressing each pain point. This information, paired with Customer Experience data, was used to form hypotheses about what caused the performance and timeliness issues.

We used our rapid prototyping process, Amperian Cycle®, to quickly design solutions to the problems identified and to redesign the hiring process to better meet customer needs and reduce time-to-hire. This approach allowed our client to deliver a superior Customer Experience, transform outdated processes, anticipate future customer needs, and re-imagine the way operations are conducted.

The Impact

Most importantly, implementation of the pilot achieved a 22 percent improvement in time-to-hire. In addition, the redesigned hiring process featured a number of initiatives to increase efficiency and transparency, including:

Dedicated customer teams, with a 60 percent reduction in hand-offs within the process
Using prescriptive vs. historical reporting to provide real-time views of each step within every hiring action
Increased quality control practices to reduce costly errors
A shift from linear to parallel processing of actions to expedite the process (a back-of-the-envelope illustration follows this list)
Increased and proactive customer communication
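To see why parallelizing independent steps matters, consider a back-of-the-envelope sketch (durations invented for illustration): when steps run one after another, the total is their sum; when independent steps run at the same time, the total is only the longest one.

```python
# Hypothetical durations (in days) for two steps that do not depend
# on each other, e.g., a security check and HR paperwork.
step_days = {"security_check": 15, "hr_paperwork": 10}

sequential_total = sum(step_days.values())  # 15 + 10 = 25 days
parallel_total = max(step_days.values())    # bounded by the slowest step: 15 days

print(f"sequential: {sequential_total} days, parallel: {parallel_total} days")
```

The size of the gain depends on which steps are truly independent of one another.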

Finally, the reduction in time-to-hire meant the organization met the one-year mandate to bring hiring time under 70 days. Even better, the prototyping process identified further opportunities to add efficiency and reduce errors, putting the organization on a path to meet its five-year goal.

Those opportunities include:

Continued reduction of operational silos
Continued process standardization and transparency
Additional customer communication
Increased accountability

Interested in learning more about The Clearing's Impact Analysis methodologies, customer experience strategies, or change management practice – or do you believe your organization needs help sharpening its processes? I'd love to chat. I can be reached at nathan.toronto@theclearing.com.

Buzzwords Defined: Big Data

At the dawn of the information age, the term “big data” is bandied about by everyone from media luminaries and consultants to tech CEOs and government officials. Following earlier Buzzwords Defined posts on prototyping and human-centered design, this post demystifies big data and highlights the opportunities that big data presents.

Contrary to popular belief, the biggest challenge with big data is not technological, but social. Relying on technical solutions like machine learning, natural language processing, and artificial intelligence will only go so far in the information age.

With increasing amounts of data and processing power, the greatest advances in managing big data will be in creating human organizations that can think critically and creatively at every level, from entry-level analyst to CEO.

Processing massive amounts of information in clever ways is not enough to ensure success; human systems need to modernize just as much as machine systems do.

Definition

Big data isn’t “big” just because there’s a lot of it. Big data is defined by three factors (the three “Vs”): volume, variety, and velocity.

Volume: Yes, big data involves large amounts of data, usually at a volume that is effectively impossible to store on a local computer. How data is stored also matters, though – whether in the cloud or on the premises ("on-prem") – since this affects how data scientists and other users can access and analyze it.
Variety: Big data also involves a variety of structured data (in traditional tabular form, such as sales figures or case-processing data) and unstructured data (such as social media posts, video files, or news articles). These datasets might also be related to one another in a variety of ways; data scientists often find themselves devising creative ways to clean and combine datasets to produce useful insights (see the sketch after this list).
Velocity: Big data relies on combining and analyzing data at speeds that allow it to influence decision-making or streamline business processes, especially in real time. To be useful, big data must be accessible in a way that matters – the right data, to the right people, at the right time.
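As a toy illustration of the variety challenge, here is one way a data scientist might pull a shared key out of unstructured notes so they can be joined to structured records. All names and data are invented for illustration.

```python
import pandas as pd

# Hypothetical structured data: case-processing records.
cases = pd.DataFrame({
    "case_id": ["C-101", "C-102", "C-103"],
    "days_open": [14, 41, 9],
})

# Hypothetical unstructured data: free-text status notes.
notes = pd.DataFrame({
    "note": [
        "Case C-102 delayed pending security paperwork",
        "C-101 closed after final interview",
    ],
})

# The cleaning step: extract a case ID from the free text so the two
# datasets share a join key.
notes["case_id"] = notes["note"].str.extract(r"(C-\d+)", expand=False)

# Combine structured and unstructured data on the shared key.
combined = cases.merge(notes, on="case_id", how="left")
print(combined)
```

In practice this step is rarely so clean – IDs go missing, get misspelled, or are formatted inconsistently – which is exactly why human judgment stays in the loop.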

Opportunities

We operate in a data-rich world, but this doesn't mean we are informed. Turning data (a collection of facts) into information requires coordinating human and machine systems. This is the key to structuring and organizing data, and it is an opportunity for organizations to stay on the leading edge of the information age. Organizations that store and process the right information efficiently hold a competitive advantage over those that don't.

To succeed in the information age, leaders need to go beyond outsourcing data-driven thinking to the "data folks." Leaders need to be involved because the structure and culture of their organizations determine the extent to which they can exploit big data to their advantage.

Consider some of the ways leaders can shape the human organizations they lead to exploit big data:

Data Standards and Governance: It is not enough to declare how data should be managed. It takes leadership at the everyday level to create the human behavior necessary to maintain these standards. This is the only way for leaders to have confidence that their data is telling them what they think it’s telling them.
Data Cataloging: At least as important as the standards and governance that produce data quality is knowing what data is on hand. This is especially important when dealing with a wide variety of data, from structured to unstructured. Data cataloging also matters for assessing the varying quality of data, and maintaining a data catalog is a human leadership challenge (a minimal sketch of a catalog entry follows this list).
Aligning Structured and Unstructured Data: Even the cleverest computer algorithms can't align structured and unstructured data without some level of human intervention (either at the coding stage or at the cleaning and merging stage). Leaders can provide the human resources to make these important links a reality.
Incubating Innovation: Setting the conditions for innovation is an inherently human activity, and since human systems are complex, leaders can't rely on the newest software or systems to make innovation a reality. Innovation requires leaders to provide clear incentives for creativity and to underwrite failure, because the key to innovation is being able to fail.
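To make data cataloging concrete, here is a minimal sketch of what a single catalog entry might record. The fields and values are illustrative assumptions, not a real catalog schema – production catalogs are far richer – but the leadership questions they answer are the same: who owns this data, what shape is it in, and can we trust it?

```python
from dataclasses import dataclass, field

@dataclass
class CatalogEntry:
    """One hypothetical entry in an organization's data catalog."""
    name: str
    owner: str        # the human accountable for this dataset's quality
    kind: str         # "structured" or "unstructured"
    location: str     # cloud bucket, on-prem share, etc.
    quality: str      # e.g., "validated", "raw", "unknown"
    tags: list = field(default_factory=list)

catalog = [
    CatalogEntry("hiring_actions", "HR Ops", "structured",
                 "warehouse.hiring.actions", "validated", ["time-to-hire"]),
    CatalogEntry("applicant_emails", "HR Ops", "unstructured",
                 "s3://archive/emails/", "raw", ["engagement"]),
]

# Even a small inventory answers a leadership question:
# what unvalidated data are we relying on?
for entry in catalog:
    if entry.quality != "validated":
        print(f"{entry.name}: quality={entry.quality}, owner={entry.owner}")
```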

Most organizations sit on a veritable gold mine of data, but turning data into a competitive advantage often requires reorganizing both human and technological systems. The Clearing specializes in human-centered design, organizational transformation, and building cultures that engender success. If that success is to include exploiting the opportunities inherent in big data, organizations will have to develop more modern human systems.

Interested in learning about how The Clearing supports organizations that mine big data? Reach out to me at nathan.toronto@theclearing.com to start a conversation.