COLLEGE PARK, Md., April 30, 2021 /PRNewswire/ -- Can we partner with artificial intelligence and machine learning tools to build a more equitable workforce? New research co-authored by Margrét Bjarnadóttir at the University of Maryland's Robert H. Smith School of Business may offer a way.
Early attempts to incorporate AI into the human resources process haven't met with resounding success. Among the most well-known failures: in 2018, Amazon.com abandoned an AI recruiting tool it had built after the tool was discovered to be discriminating against female job applicants.
In the research, recently awarded Best White Paper at the 2021 Wharton Analytics Conference, Bjarnadóttir and her co-authors examine the roots of those AI biases and offer solutions to the challenges they present – solutions that could unlock the potential of analytics tools in the human resources space. For example, they recommend creating a bias dashboard that breaks down a model's performance for different groups, and they offer checklists for assessing your work – one for internal analytical projects and another for adopting a vendor's tool.
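The paper doesn't specify how such a dashboard is implemented; as a minimal sketch, a per-group report might compute accuracy and selection rate separately for each demographic group, so that disparities hidden by aggregate metrics become visible. The function name, toy data, and metric choices below are illustrative assumptions, not the authors' actual tool.

```python
# Sketch of a per-group "bias dashboard": given true labels, predictions,
# and a group attribute, report each group's size, accuracy, and selection
# rate (share of positive predictions). All names and data are hypothetical.
from collections import defaultdict

def bias_dashboard(y_true, y_pred, groups):
    """Return {group: {"n", "accuracy", "selection_rate"}}."""
    stats = defaultdict(lambda: {"n": 0, "correct": 0, "selected": 0})
    for t, p, g in zip(y_true, y_pred, groups):
        s = stats[g]
        s["n"] += 1
        s["correct"] += int(t == p)   # was the prediction right?
        s["selected"] += int(p == 1)  # was this person flagged positively?
    return {
        g: {
            "n": s["n"],
            "accuracy": s["correct"] / s["n"],
            "selection_rate": s["selected"] / s["n"],
        }
        for g, s in stats.items()
    }

# Toy promotion data: label 1 = recommended for promotion.
y_true = [1, 0, 1, 1, 0, 1, 0, 0]
y_pred = [1, 0, 1, 1, 0, 0, 0, 0]
groups = ["M", "M", "M", "M", "F", "F", "F", "F"]
print(bias_dashboard(y_true, y_pred, groups))
```

Here overall accuracy looks strong, but the group breakdown shows the model never recommends anyone in the "F" group – exactly the kind of gap a dashboard is meant to surface.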
"We wanted to look at the question of: How can we do better? How can we build toward these more equitable workplaces?" says Bjarnadóttir, associate professor of management sciences and statistics at Maryland Smith.
It wouldn't be simple. The biases stem from an organization's HR history. In Amazon's case, the analytics tool was found to be rejecting resumes from applicants for technical roles because of phrases like "women's chess team." With technical jobs long dominated by men, the model had taught itself that factors correlated with maleness were indicators of potential success.
"We can't simply translate the analytical approaches that we use in accounting or operations over to the HR department," Bjarnadóttir says. "What is critically different is that, in contrast to, say, determining the stock levels of jeans or which ketchup brand to put on sale, the decisions that are supported in the HR department can have an instrumental impact on employees' lives: who gets hired, who receives a promotion, or who is identified as a promising employee."
And because the data an HR model draws from is historical, it's tough to amend. In other contexts, there are more obvious remedies. For example, if a skin-cancer-detecting AI tool fails to detect skin cancers in darker skin tones, one could feed more diverse images into the tool and train it to identify the appropriate markers. In the HR context, organizations can't go back in time and hire a more diverse workforce.
And HR data is typically what is called unbalanced – meaning not all demographic groups are equally represented – which causes issues when algorithms interact with the data. A simple example of that interaction: analytical models will typically perform best for the majority group, because good performance for that group carries the most weight in overall accuracy – the measure that most off-the-shelf algorithms optimize.
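The weighting effect described above is just arithmetic, and a short illustration (with hypothetical numbers, not from the research) makes it concrete: when one group supplies 90% of the data, a model that is useless for the minority group can still post an overall accuracy above 90%.

```python
# Illustrative arithmetic with made-up numbers: overall accuracy is a
# sample-weighted average, so the majority group dominates it.
n_maj, n_min = 900, 100          # unbalanced data: 90% / 10% split
acc_maj, acc_min = 0.95, 0.50    # model is a coin flip for the minority group

overall = (n_maj * acc_maj + n_min * acc_min) / (n_maj + n_min)
print(f"overall accuracy:  {overall:.3f}")   # looks fine in aggregate
print(f"minority accuracy: {acc_min:.3f}")   # hidden by the average
```

An algorithm that optimizes only the aggregate number has no incentive to close that gap, which is why per-group monitoring matters.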
So if your data aren't balanced, even carefully built models won't lead to equal outcomes for different demographic groups. For example, in a company that has employed mostly male managers, a model is likely to disproportionately identify men as future management candidates – correctly flagging many of them, while being more likely to overlook qualified women.
So, how can organizations "do better," as Bjarnadóttir says, in future hiring and promotion decisions, and ensure that applications are unbiased, transparent and fair? The first step is to apply a bias-aware analytical process that asks the right questions of the data, of modeling decisions and of vendors, and then monitors not only the statistical performance of the models but, perhaps more importantly, the tool's impact on the different employee groups.
Visit Smith Brain Trust for related content at http://www.rhsmith.umd.edu/faculty-research/smithbraintrust and follow on Twitter @SmithBrainTrust.
About the University of Maryland's Robert H. Smith School of Business
The Robert H. Smith School of Business is an internationally recognized leader in management education and research. One of 12 colleges and schools at the University of Maryland, College Park, the Smith School offers undergraduate, full-time and part-time MBA, executive MBA, online MBA, specialty masters, PhD and executive education programs, as well as outreach services to the corporate community. The school offers its degree, custom and certification programs in learning locations in North America and Asia.
Contact: Greg Muraski at [email protected] or 301-892-0973.
SOURCE University of Maryland's Robert H. Smith School of Business