
Latest news with #marginalizedGroups

From fans in war paint to 100M streams, DJ duo The Halluci Nation talks Indigenous dance floors

CBC

14-06-2025

  • Entertainment
  • CBC


Three-time Juno Award-winning duo Bear Witness and Tim '2oolman' Hill, former members of A Tribe Called Red, say they feel they have managed to create safe spaces for Indigenous people and other marginalized groups at their performances. However, they say their wider appeal has presented some challenges at their shows, which illustrate a lack of shared experience between Indigenous and non-Indigenous people.

AI job recruitment tools could ‘enable discrimination' against marginalised groups, research finds

ABC News

07-05-2025

  • Business
  • ABC News


Artificial intelligence hiring systems are increasingly being used by Australian employers to screen and shortlist job candidates, but new research finds this technology creates serious risks of discrimination.

AI systems promise to save employers time and money in the recruitment process by using cutting-edge technology, such as CV scanners and vocal assessments, to "classify, rank and score" job applicants. This means a computer program could be assessing a job seeker's application right now, and accepting or rejecting it on the basis of its machine understanding before the person reaches an interview stage with a human.

Yet new research from Natalie Sheard, a lawyer and postdoctoral fellow at the University of Melbourne, has found AI hiring systems may "enable, reinforce and amplify discrimination against historically marginalised groups".

"There are some serious risks that are created by the way in which these systems are used by employers, so risks for already disadvantaged groups in the labour market — women, job seekers with disability or [from] non-English speaking backgrounds, older candidates," she tells ABC Radio National's Law Report.

About 62 per cent of Australian organisations used AI "extensively or moderately" as part of their recruitment processes last year. Yet Australia does not currently have any specific laws to regulate how these tools operate or how organisations may use them.

In one example, an AI system developed by Amazon learned to downgrade the applications of job seekers who used the word "women's" in their CVs.

The AI tools used in hiring

Dr Sheard interviewed 23 people as part of her research into AI hiring systems. Participants were mainly recruiters who had worked at small, medium and/or large organisations, both private and public, and in a range of industries. She also spoke to two careers coaches to understand the impact of AI hiring practices on job candidates, as well as a leading Australian AI expert and two employees of a large AI developer — the director of AI Services and the AI ethics leader.

Her focus was on three aspects of recruitment screening: CVs, candidate assessments (which may include psychological or psychometric tests) and video ("robo") interviews. Robo interviews typically involve candidates recording themselves answering a series of questions, which are then assessed by AI.

Dr Sheard says some of these systems have analysed applicants' faces: "It had a look at the facial features and movements of applicants to assess their behaviour, personality. [For example] it was looking to see [if] they [were] enthusiastic or angry when they spoke to customers."

In 2019, America's Electronic Privacy Information Center filed a complaint against a third-party recruitment agency, HireVue, over software it used to analyse video interviews with applicants.

How do AI hiring systems impact marginalised candidates?

AI hiring tools can suffer from data bias, Dr Sheard says, since the system learns from the information it is fed. Some AI hiring systems are built using large language models, for example, and if they are missing datasets from a disadvantaged group, "they won't be representative" of the broader population, the academic says.
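One simple way to check a training set for the kind of gap Dr Sheard describes is to compare each group's share of the training data with its share of the applicant pool. The Python sketch below is purely illustrative; the field names, figures and 5 per cent tolerance are hypothetical and do not describe any real vendor's system.

```python
# Illustrative sketch only: a minimal training-data representation audit.
# The records, field names and benchmark shares are hypothetical.
from collections import Counter

def representation_gaps(records, group_key, benchmark_shares, tolerance=0.05):
    """Compare each group's share of the training data against a benchmark
    (e.g. its share of the applicant pool) and flag under-represented groups."""
    counts = Counter(r[group_key] for r in records)
    total = sum(counts.values())
    gaps = {}
    for group, expected in benchmark_shares.items():
        observed = counts.get(group, 0) / total
        if observed + tolerance < expected:
            gaps[group] = {"observed": round(observed, 3), "expected": expected}
    return gaps

# Hypothetical historical-hiring records used to train a screening model.
training_records = (
    [{"gender": "men"}] * 850
    + [{"gender": "women"}] * 130
    + [{"gender": "non-binary"}] * 20
)
# Hypothetical shares of the applicant pool to compare against.
applicant_pool = {"men": 0.55, "women": 0.42, "non-binary": 0.03}

print(representation_gaps(training_records, "gender", applicant_pool))
# {'women': {'observed': 0.13, 'expected': 0.42}}
```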
The same problem arises if there are biases in the AI system's training data, which the model then adopts and reproduces.

"They're trained on things that are scraped from the internet. For example, we know … only about 15 per cent of women contribute to articles on Wikipedia. So it incorporates that male view, which can then be brought through into recruitment processes."

One example of learned gender discrimination occurred at Amazon, when it built an experimental recruitment tool trained on the CVs of past applicants.

"That model learnt to systemically discriminate against women, because it's a male-dominated field. [Many of] those CVs came from men, and so the system learnt to downgrade the applications of women who applied for positions through that tool, particularly when they use the word 'women's' in their application. For example, [if] they [wrote they] attended a women's college," Dr Sheard says.

"And it also picked up language styles that are used by men, so when particular words were used more typically by men … like executed or actioned, it upgraded those applications."

The recruitment tool was reportedly found to be sexist and was ultimately scrapped.

AI tools under the microscope in the US

In America, several complaints allege AI tools have discriminated against job applicants from different backgrounds.

In one case, it was claimed the hiring software used by HireVue in the recruitment process resulted in discrimination against an applicant referred to as DK. The complaint claims DK, who is Indigenous and a Deaf woman who speaks with a deaf accent, had been working for her employer for more than five years and was encouraged by her supervisor to apply for a seasonal manager position at her company.

She had the experience and qualifications needed for a promotion, but she claims she was rejected for the role after completing an automated video interview and assessment with HireVue.

HireVue uses automated speech recognition systems to generate transcripts based on the audio of the interview, and it's alleged these systems perform worse for non-white people as well as deaf or hard-of-hearing speakers.

"The way that these systems [assess your communication skills] is by assessing how well you speak standard English … [so] if you speak English as a second language and use non-standard English when you speak, then you're likely to be assessed as not having good communication skills by these systems," Dr Sheard says.

After DK was informed that she did not get the promotion, she received an email with feedback about how to improve on the HireVue assessment, including a direction to "practice active listening".

"That makes you question what kind of human oversight was provided for this whole process? Because at some point a human should have intervened and said, 'Well, this is just at odds with what we know about this person,'" Dr Sheard says.
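To see how transcript quality can shape an automated assessment, here is a minimal, purely illustrative Python sketch (not HireVue's actual pipeline): a hypothetical "communication" score computed from how closely the automated transcript matches what was actually said. If the speech recogniser performs worse for a deaf speaker or a speaker of non-standard English, the score drops even though the answer was the same.

```python
# Illustrative sketch only: how automated-transcript errors can flow into a
# downstream "communication" score. The scoring rule and transcripts are
# hypothetical.

def word_error_rate(reference, hypothesis):
    """Word-level edit distance between what was said and what the ASR heard,
    divided by the length of the reference."""
    ref, hyp = reference.split(), hypothesis.split()
    dist = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dist[i][0] = i
    for j in range(len(hyp) + 1):
        dist[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dist[i][j] = min(dist[i - 1][j] + 1,        # deletion
                             dist[i][j - 1] + 1,        # insertion
                             dist[i - 1][j - 1] + cost) # substitution
    return dist[len(ref)][len(hyp)] / len(ref)

def communication_score(reference, asr_transcript):
    """Hypothetical scorer that rewards 'clean' transcripts: the more the ASR
    mishears, the lower the candidate's score, regardless of what they said."""
    return round(100 * (1 - word_error_rate(reference, asr_transcript)), 1)

answer = "i have led this team for five years and trained every new starter"

# The same spoken answer, as transcribed for two hypothetical speakers.
asr_speaker_a = "i have led this team for five years and trained every new starter"
asr_speaker_b = "i have lead these team for five year and train every new start"

print(communication_score(answer, asr_speaker_a))  # 100.0
print(communication_score(answer, asr_speaker_b))  # noticeably lower
```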
In a separate case, a tutoring group that was recruiting in the US was found to have programmed a hiring system to automatically reject female applicants over 55 years of age and male applicants over 60 years of age, which resulted in the rejection of 200 qualified candidates.

The employer was taken to court by the US Equal Employment Opportunity Commission, an action that resulted in a $US365,000 settlement.

Another complaint, raised by the American Civil Liberties Union Foundation (ACLU) against a provider of AI hiring systems that designs, sells and administers online assessments to employers, alleges the assessments discriminate on the basis of disability and/or race.

One of the ACLU's complaints is that an algorithmically driven Adaptive Employee Personality Test adversely impacts "autistic people, otherwise neurodivergent people, and people with mental health disabilities such as depression and anxiety".

The foundation says this is because it tests for characteristics that are "close proxies of their disabilities" and which are "likely not necessary for essential job functions for most positions". It adds that the scores of applicants with disabilities are likely to be significantly affected by those characteristics.

"If you're a job seeker who experiences depression, you're probably not going to score highly on positivity, so that is really going to be a proxy for a disability that you might have and may filter you out of the process when you might be able to perform all of the essential requirements of the role," Dr Sheard says.

What about AI hiring tools in Australia?

Similar legal action has not yet taken place in Australia, but the Merit Protection Commissioner, which reviews employment decisions such as job progression, has offered guidance for public sector employers using AI hiring systems.

It came after the commissioner overturned 11 promotion decisions made by government agency Services Australia in a single recruitment round during the 2021-22 financial year. Applicants were required to pass through a sequence of AI assessments, including psychometric testing, questionnaires and self-recorded video responses. No human was involved in the process or review, and the commissioner found the system led to meritorious applicants missing out on promotions.

The Merit Protection Commissioner has since warned that not all AI recruitment tools on the market have been thoroughly tested, nor are they guaranteed to be completely unbiased.

"I think there's absolutely a need to regulate these AI screening systems," Dr Sheard says.

"Some groups have called for a complete ban on these systems, and I think there is a lot of merit to that argument, particularly while we're in a situation where we don't have proper legal safeguards in place, and where we don't really understand the impacts of these systems on already marginalised groups in the labour market."

While employers may argue these AI hiring systems create a more efficient recruitment process, Dr Sheard says this argument needs to be balanced against "the risks of harming marginalised groups".
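One simple diagnostic that can surface outcomes like the age-based auto-rejections above is the "four-fifths" selection-rate comparison used in US adverse-impact analysis. The Python sketch below is illustrative only: the applicant and outcome numbers are made up, and the rule is a rough screening heuristic rather than a legal test.

```python
# Illustrative sketch only: the "four-fifths" (80%) rule of thumb used in
# US adverse-impact analysis. The selection counts below are hypothetical.
def adverse_impact(selected_by_group, applicants_by_group, threshold=0.8):
    """Flag groups whose selection rate falls below threshold * the rate of
    the most-selected group."""
    rates = {
        g: selected_by_group.get(g, 0) / applicants_by_group[g]
        for g in applicants_by_group
    }
    best = max(rates.values())
    return {
        g: {"rate": round(rate, 3), "ratio_to_best": round(rate / best, 3)}
        for g, rate in rates.items()
        if rate < threshold * best
    }

# Hypothetical outcomes from an automated screening stage, by age band.
applicants = {"under_40": 400, "40_to_54": 250, "55_and_over": 150}
advanced   = {"under_40": 120, "40_to_54": 70,  "55_and_over": 15}

print(adverse_impact(advanced, applicants))
# {'55_and_over': {'rate': 0.1, 'ratio_to_best': 0.333}}
```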
In February, the House Standing Committee on Employment, Education and Training released a report on the use of AI in the workplace.

"The government proposed that there should be some mandatory, what they call, guardrails for high-risk applications of AI. And high-risk applications are generally considered to include these AI screening systems," Dr Sheard says.

"That process hasn't concluded and is somewhat up in the air with the election … [so] there's no time frame for that."

In the meantime, Dr Sheard believes the Australian government needs to review its anti-discrimination laws to make sure they are still "fit for purpose" and cover "these new technologies, particularly around liability".
