The Dilemma Facing Today's Job Candidates
Job candidates are placing their data in your hands. Are you treating it with the necessary care?
By Colin Day
In a world where location-based apps are following you long after they're turned off, smart home devices are picking up more than you realize, and Google knows you better than your best friend, you can start to feel like you're living in an episode of the popular British sci-fi drama Black Mirror.
For most corporations, though, data is currency. As such, I'm sure we'll continue to see tech vendors barreling toward similar advancements in artificial intelligence (AI) each day, sometimes with little sensitivity toward the preferences -- or concerns -- of their human users.
Yet, as technology providers, it's our responsibility to take a step back from the noise made by emerging innovations and return to the basic principles of good business.
Within the recruiting industry, "candidate experience" has been a hot topic for some time. And as the demand for talent continues to increase, the power shifts deeper into the job candidate's hands. Considering that one bad hire costs nearly $17,000 on average, employers simply can't afford to overlook the importance of the recruiting function any more.
Now, organizations finally recognize that they must have an employer brand that is unique and engaging, while also offering an application process that is mobile- and social media-friendly and as fast and easy as possible. But we rarely stop to consider that the data candidates share during this process could now be getting more exposure than they had ever anticipated or signed up for.
It shouldn't have to be this way. Applying for a job is an admirable act of vulnerability. People are really putting themselves out there, hinging their ability to work -- their livelihood -- on whether their resumes get noticed or a hiring manager thinks they're a fit within the first 90 seconds of an interview. There's a lot of pressure to be transparent about past experiences, long-term career plans, and salary history. And now, with the added importance of a candidate's social media presence, even more personal details are open to scrutiny within the hiring process.
Here's the real point of concern: When a candidate submits a job application, there is an implied contract of trust. Applicants expect that their data will remain protected and treated with respect throughout the entire hiring process. Rightfully so.
In general, more than half of Americans say they trust businesses with their personal information online. In the U.S., there is certainly a pervasive belief in individuals' rights over most aspects of their own lives, but that belief doesn't translate so clearly when it comes to big data. In terms of legislation, there are still major gaps in the way we empower citizens to control their online information.
On the other side of the Atlantic, the EU's upcoming General Data Protection Regulation takes a major stride toward preserving candidate privacy. Regarding the hiring process, the mandate specifies and strengthens the rights of the candidate (data subject), adding transparency to data processing and stronger indicators of consent, as well as the obligations of employers (data controllers) and talent acquisition and HCM software providers (data processors). Any company based in the U.S. that does business in or accepts job applicants from the EU will also see the effects of the regulation.
Government legislation cannot and will not be able to keep up with constantly shifting data practices and daily innovations in AI. But it does teach us a lesson about empowerment -- offering candidates more transparency into the online hiring process and the ability to decide who can have access to their personal data. And it raises the question: Are employers doing enough to ensure their applicants are being treated with care?
The majority of workers (70 percent) have searched for a new job while at their workplace. So chances are, your current candidate pool is filled with people who are currently employed. On top of that, nearly two-thirds (65 percent) of job seekers are worried their coworkers or employer will find out they are looking for a new job, putting their current means of earning a living in jeopardy.
This is already happening. Startups like hiQ Labs scrape data from public LinkedIn profiles to determine if an employee is going to quit, and then sell that information to employers as engagement data. Sure, this information is valuable, and different rules and agreements apply to an employer-employee relationship versus that of a potential employer and their applicants. But I'd argue that how you treat available candidate data is equally as reflective of your business ethics as how you treat employee data, for better or worse.
Nearly everyone avoids companies that they don't believe protect their privacy. These people are your customers, your partners, and of course, your employees. As technology providers in the business of connecting people with companies, we must especially consider where the emerging advancements in data mining, AI, and blockchain technology should and shouldn't be going, and come to an agreement before we cross a very sensitive line.
Simply put, job seekers expect employers to treat applicants with the same respect as current employees. And it's hard to argue with that. As much as we want to make the recruitment process easy, it's much more nuanced thanks to its position as the public-facing part of HR. To enable consumer trust, employers and their technology partners must come to a consensus on the ethics behind AI in hiring, and give applicants better controls over their data.
We owe it to them.
Colin Day is the chairman and CEO of iCIMS.