From tool fads to compliance challenges, here’s the lowdown on the current state of artificial intelligence from the mouths of industry experts.
Barely a day goes by without some new headline about artificial intelligence. The excitement about it is palpable – the possibilities for talent acquisition, some say, are endless. Fears about it are spreading, too, especially following the twists and turns of the Workday AI discrimination lawsuit.
Since most of us are using AI already, there’s no question of whether we “can” use it in TA. It can already craft job descriptions, ‘chatbot’ candidates through the application process, screen resumes, shortlist qualified candidates, write interview questions, evaluate test scores, schedule interviews, and perform any number of repetitive, low-value tasks at scale. This chart from METR shows just how quickly AI is advancing in its ability to complete long and complex tasks – the length of task it can handle has doubled approximately every 7 months for the last 6 years. At that pace, in under five years, generalist autonomous agents will be capable of performing in minutes what it would take a human worker a week to accomplish.
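As a rough sanity check on that extrapolation, here is a back-of-the-envelope sketch in Python of what a 7-month doubling time implies over five years. The only input is the trend figure cited above; everything else is simple arithmetic, not additional data:

```python
# Back-of-the-envelope: if the task horizon AI can handle doubles
# every 7 months (the METR trend cited above), how much longer are
# the tasks it can complete after five years?
DOUBLING_MONTHS = 7
months = 5 * 12                         # a five-year window
doublings = months / DOUBLING_MONTHS    # ~8.6 doublings
growth = 2 ** doublings                 # ~380x longer tasks
print(f"~{doublings:.1f} doublings -> tasks ~{growth:.0f}x longer")
```

On this naive extrapolation, today’s task horizon grows by a factor of a few hundred within five years – the kind of compression the week-to-minutes claim is gesturing at.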

Amid the AI gold rush, TA leaders face a unique challenge. They need to make sense of the latest tools being pitched, figure out what those tools genuinely offer, and ask the right questions to see through the noise.
To help with that, we brought the experts together in a back-to-basics roundtable. Here are their answers to the burning questions you always wanted to ask about AI in talent acquisition.
- Want the full smorgasbord of insights? Watch the roundtable here.
How does artificial intelligence in recruiting work?
“Generative AI doesn’t understand anything at all. It doesn’t know facts or feel emotions. It just predicts the next best word, pixel, or sound based on the patterns that specific model was trained on.” – Matt Staney
It boils down to this: traditional software does exactly what it’s told, following instructions written as code. Generative AI instead uses machine learning and technologies like natural language processing to replicate human-like behavior and speech patterns at unprecedented scale. That’s what lets it write entire articles, songs, and even novels in your voice from scratch, and perform tasks that typically require human intelligence.
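To make “predicts the next best word based on patterns” concrete, here is a deliberately tiny sketch in Python: a word-pair (bigram) counter, a toy stand-in for what real models do with billions of parameters. The corpus and function names are invented for illustration:

```python
from collections import Counter, defaultdict

def train_bigram(corpus: str) -> dict:
    """Count, for every word, which words follow it in the training text."""
    words = corpus.lower().split()
    model = defaultdict(Counter)
    for current, nxt in zip(words, words[1:]):
        model[current][nxt] += 1
    return model

def predict_next(model: dict, word: str):
    """Return the most frequent follower seen in training (None if unseen)."""
    followers = model.get(word.lower())
    return followers.most_common(1)[0][0] if followers else None

# A toy "training set": the model knows only these patterns.
corpus = (
    "the recruiter reviewed the resume "
    "the recruiter called the candidate "
    "the recruiter reviewed the resume"
)
model = train_bigram(corpus)
print(predict_next(model, "recruiter"))  # -> "reviewed" (seen twice vs. once)
print(predict_next(model, "the"))        # -> "recruiter" (most common follower)
```

The toy model has no idea what a recruiter *is* – it only knows which word most often followed which in its training text. That is the pattern-recognition Staney describes, scaled down to a few lines.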
Matt Staney, VP of Community at Talentful, says that as smart as AI seems, the thing to remember is that it doesn’t think for itself. It is taught to analyze information and draw inferences from patterns within datasets, and it’s that pattern recognition that enables it to predict the next best word, sentence, or outcome. “That’s why it’s really important, as you’re evaluating tech, to ask them how they train their data,” he says.
Tell me more about AI’s training data. What data is (or should be) used?
“Any enterprise employer has enough historical information where you could train [AI] to extract value for sure.” – Matt Charney
AI models learn patterns and behaviors exclusively from their training data, similar to how students learn from textbooks. Using irrelevant or misaligned data leads to poor model performance: “If you’re buying a sourcing product that was trained and cross-validated using a database of contingent search firm networks and client candidate data, it may not work for your in-house recruiting team,” Staney says. The more domain-specific the training data, the more accurate and applicable your AI tools are going to be.
Matt Charney, Chief Marketing Officer at Employer.com, says companies already have a significant amount of the training data they need, particularly outcome-based data (i.e., data that shows successful hires and what skills and qualifications they have) – though they are not yet using it to its full potential. While ChatGPT had to absorb subsets of the entire internet, today’s enterprise AI models can be built from much smaller sample sets. Companies may only need a small sliver of highly relevant data, which can then be replicated and scaled around specific scenarios or candidate profiles to fit their specific needs.
Both Charney and Staney believe that, eventually, each company will have its own ChatGPT or DeepSeek equivalent – an internal AI trained on the company’s own data and built using custom large language models (LLMs). This AI would better manage talent acquisition tasks unique to that company and predict how someone is likely to perform in a specific role.
How should talent acquisition approach compliance, and potential bias, in AI?
“Recruiters often ask, can AI screen candidates? The answer is 100% yes. Can it screen well, and compliantly? That depends on the data you use.” – Leah Daniels, Job Sync
When you read about bias in AI, training data is often the source. If the data used to train the model did not represent a diverse enough population, or was incomplete, irrelevant, or outdated (e.g. internal hiring data from 15 years ago that doesn’t reflect how job skills have changed), the AI will produce unreliable outputs (“garbage in, garbage out”). That’s another reason to check the vendor’s training model and apply human oversight to the final outcome.
Biases don’t just live in the data, however – they can also be introduced by the algorithms. Cien Solon, CEO and Co-founder of LaunchLemonade, explains: “What are you looking for with your query? What’s in your system prompt? Are you seeking specific experience, or [a candidate] with a growth mindset? These factors can be added to your query set and impact the result of your query. So it’s not just raw data itself that will produce bias, but your query, the user, the system prompt, and the algorithm. There are a lot of things to dissect in these tools that would allow you to understand, am I getting the right output?”
Another compliance concern is the leakage of proprietary data. Even innocent tools like AI note takers can expose sensitive information unintentionally, so you’ll need to think carefully about what data you’re sharing and how it’s being used.
What are the must-have talent tech AI tools?
“The best tool is the one that your recruiters will actually use, see value in, and you get something out of it.” – Matt Staney
Whether it’s Microsoft Copilot or Google Gemini, chances are high that an AI assistant is already built into your productivity stack, and it’s a good place to start. These free AI assistants can dramatically reduce the time it takes to write job descriptions, craft interview questions, draft and personalize candidate outreach messaging, summarize candidate profiles and resumes to present to a hiring manager, draft scorecards, and so on.
Beyond that, the AI market is still too young to either dismiss a tool outright or commit to one forever. Our experts were unanimous on the need for talent organizations to experiment – and to choose tools that the whole team (and not just your early adopters) will use. “You have to try them and find the one that works for you,” says Cameron Moore, Employer Brand expert at Snap Inc. “But be specific and consider the nuances … What do you really need it for as opposed to the shiny new thing?” In other words, put the process before the tool.
Staney recommends that you start with a specific use case, then pilot a small project. “Say, we’re going to solve some inefficiencies in scheduling, and we’re going to test a tool. Calculate how much less time you spend doing that scheduling task, and how much better your recruiter bandwidth becomes. You can tie that to ROI, quality of hire, speed of hire, candidate satisfaction etc.”
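Staney’s pilot advice lends itself to simple math. A minimal sketch of tying a scheduling tool’s time savings to a dollar figure – every number below is invented purely for illustration:

```python
# Hypothetical pilot ROI math (all figures invented for illustration):
# a scheduling tool saves each recruiter time on every interview booked.
recruiters = 5
interviews_per_week = 30          # per recruiter
minutes_saved_each = 8            # tool vs. manual back-and-forth
hourly_cost = 40.0                # fully loaded recruiter cost, USD

weekly_hours_saved = recruiters * interviews_per_week * minutes_saved_each / 60
annual_value = weekly_hours_saved * hourly_cost * 48   # ~48 working weeks
print(f"{weekly_hours_saved:.0f} hours/week -> ${annual_value:,.0f}/year")
```

The dollar figure is only a starting point; as Staney notes, the same saved hours can also be tied to quality of hire, speed of hire, and candidate satisfaction.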
Charney makes the point that TA probably shouldn’t be thinking about its use cases in isolation, especially when building internal AI systems. “Everyone in the organization is talking about AI, and the real power comes in working cross functionally – not just looking at a recruiting use case, but an enterprise use case. As an organization, you’re going to have a shared ontology … if you’re defining words the same way, you’re going to get a lot more power out of it.”
Solon agrees: “It’s not about choosing the right model for an entire institution, but a set of models. I say every language model has its own personality…and each language model is good at very specific things. So it’s about finding the right model for the right task.”
What is the best way to train myself, and my team, about the value of AI in recruiting and how to leverage it?
“Your generative AI tool is like a really knowledgeable intern with zero common sense. You have to spell out exactly what you want.” – Matt Staney
AI can deliver hours, even weeks, of productivity savings for overstretched recruiting teams, and that’s a major selling point for companies. At the same time, TA teams struggle to prove a causal link between the tasks they do every day and top-line revenue. ‘Hours saved in individual employee productivity’ is not necessarily a barometer of profitability or revenue for the wider business.
At a high level, teams need to focus acutely on the broader business goal, then work backward to determine how recruiting contributes to achieving that goal, and backward again to how AI can support that contribution. It’s not a case of ‘having AI’ for the sake of it, but of using AI as a strategic tool to achieve specific business objectives. Whenever you train a team on the best use of AI, it has to be through that lens. “Look at ‘how can I create a causation to what recruiting does?’ That’s the ultimate MO,” Charney says.
From a usability perspective, AI needs clear prompts and instructions or it will produce nonsense. As Staney says, “AI doesn’t have common sense. It doesn’t know what it’s like to be a human. It’s just following patterns, following instructions – you have to be very specific and spell it out.”
It’s incumbent on TA teams to understand what they’re asking for, and learn the correct prompts to get those results. As a priority, Moore says “people are going to have to learn how to ask better questions, and that will give you better answers and better outputs.” If you’re going to prove ROI to your business stakeholders, you need to understand the data and how it was obtained.
How are candidates using AI and what should we do about it?
“I’m aware of very cool tools that recruiters and hiring teams can use to detect [if resumes are AI generated]…it’s a cat and mouse where we’re both catching up with each other.” – Cien Solon
Job Sync’s recent Pulse Check on Candidate Application Preferences in 2025 found that half of all candidates are now experimenting with generative AI to optimize and tailor resumes, write personalized cover letters and identify missing qualifications for specific job postings. Some are using AI to submit applications en masse, targeting hundreds of roles with minimal effort. The response from our experts? It’s not the problem many believe it to be.
“If a candidate has trained this AI on their specific experience, and the recruiting team has trained their AI specifically on their culture and hiring process, then are these not extensions of the humans figuring this out so that when it gets to interview, they can actually have a conversation and not do the repetitive screening questions?” Staney says. In his view, any technology that “shortens the process” of recruiting is a win.
For Cameron Moore, it’s a question of context – when is it appropriate for candidates to use AI and when does it get in the way of hiring decisions? “Using AI to make a resume? No problem. Do what you need to do. But using AI to answer a complex question? That’s not going to help us hire,” he says. His company doesn’t allow the use of AI in interviews or coding challenges, and those policies are communicated to candidates up front so there are no surprises.
Solon adds that for every problem AI creates, there’s a solution coming out. Tools are emerging that can detect when candidates are using AI, and essentially filter out the filters. Right now, we’re in the hype phase, but things will level out once the novelty wears off.
Is AI going to take recruitment jobs?
“AI is not going to replace you, but someone mediocre that knows AI will replace you.” – Cameron Moore
Don’t write a eulogy for recruiting just yet. Charney explains: “While some enterprise employers are drinking the Kool-aid, for AI to work it needs a rational actor. The input has to rationally align to the output, and that could not be more antithetical to what a recruiter actually does day-to-day. As long as people have any sort of complexity; as long as there are variables from ‘trailing spouses’ to ‘You can take your pre-planned vacation before you start,’ there will not be any way to fully replace recruiters.”
He thinks the role will transition towards the end of the process: “offer negotiation, fit, and all that. I think sourcing will probably be displaced because it’s table stakes at this point.”
Staney agrees that “the role is evolving, not vanishing.” He predicts that recruiters won’t be replaced, but administrative tasks like screening, scheduling, and sourcing at scale will be. The irreplaceable human qualities recruiters bring to the table – intuition, relationship-building, empathy, and the ability to manage complex, unpredictable hiring dynamics – will still be very much in demand, and even the next generation of autonomous, agentic AI will not replace human connection. “It’s not human versus AI; it’s human plus AI,” he says.
Solon says the real question is not whether recruiters will lose their jobs, but whether they will lose their jobs as they currently understand them. “Yes, definitely,” she says. “But I’m betting on recruiters becoming entrepreneurial.” She shares the story of LaunchLemonade’s first customers, recruiters who “built tools that helped hiring managers and teams score for skills, automated checking CVs, and automated checking if CVs were AI generated.” That’s the direction she expects recruiters to go in – becoming consultants with high-level skills.
What is the workforce going to look like in the AI-powered future?
“It goes back to the basis of learning in the first place, like critical thinking skills, being adaptable, and keeping the curiosity of learning.” – Cameron Moore
We’re already at the point where generative AI can provide an instant answer to almost any question and deliver a step-by-step guide on how to perform or learn a skill. Like it or not, this technology will change the workforce. Those who can use AI effectively to get the highest-quality work done in the shortest amount of time stand to win the most.
Employers will be looking for creativity, adaptability, and learning agility in their workforce – people who can build a set of solutions to problems, not just execute the tasks that AI delivers. “It’s not just recruiters; people will have to learn how to become entrepreneurial as well,” Solon predicts. “We’re going to stop hiring for roles and specific new hires, and just look for problem solvers.”