A leading IT industry body has warned that human reviews of AI decisions need legal protection.
BCS, The Chartered Institute for IT, issued the warning following the launch of the ‘Data: A New Direction’ consultation by the Department for Digital, Culture, Media and Sport (DCMS).
The consultation aims to re-examine the UK’s data regulations post-Brexit. EU laws that were previously mandatory while the UK was part of the bloc – such as the much-criticised GDPR – will be looked at to determine whether a better balance can be struck between data privacy and ensuring that innovation is not stifled.
“There’s an opportunity for us to set world-leading, gold standard data regulation which protects privacy, but does so in as light-touch a way as possible,” said then-UK Culture Secretary Oliver Dowden earlier this year.
Among its proposals, DCMS is considering the removal of Article 22 of the GDPR, which covers the right to a human review of fully automated decisions.
Dr Sam De Silva, Chair of BCS’ Law Specialist Group and a partner at law firm CMS, explained:
“Article 22 is not an easy provision to interpret and there is danger in interpreting it in isolation like many have done.
“We still do need clarity on the rights someone has in the scenario where there is fully automated decision-making which could have a significant impact on that individual.”
AI systems are being used to make increasingly critical decisions, including whether to offer loans or approve insurance claims. Given the unsolved issues with bias, there is a risk that discrimination ends up being automated.
One school of thought is that humans should always make final decisions, especially ones that impact people’s lives. BCS believes that human reviews of AI decisions should at least have legal protection.
“Protection of human review of fully automated decisions is currently in a piece of legislation dealing with personal data. If no personal data is involved, the protection does not apply, but the decision could still have a life-changing impact on us,” added De Silva.
“For example, say an algorithm is created deciding whether you should get a vaccine. The data you need to enter into the system is likely to be DOB, ethnicity, and other things, but not name or anything which could identify you as the person.
“Based on the input, the decision could be that you’re not eligible for a vaccine. But any protections in the GDPR would not apply as there is no personal data.”
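For illustration only, the sketch below shows the kind of fully automated decision De Silva describes: the inputs contain no name or other identifier, yet the output determines access to a vaccine. The function name, fields, and eligibility rules are invented for this example and are not from BCS or the consultation.

```python
from datetime import date

def vaccine_eligibility(date_of_birth: date, ethnicity: str, has_chronic_condition: bool) -> bool:
    """Illustrative only: the thresholds and rules here are invented for this sketch."""
    age = (date.today() - date_of_birth).days // 365
    # Invented rules: prioritise older applicants and those with a chronic condition.
    # As in the article's example, nothing supplied here identifies the individual by name,
    # yet the output is a potentially life-changing decision.
    return age >= 50 or has_chronic_condition

# Example run: a younger applicant with no chronic condition is refused.
print(vaccine_eligibility(date(1992, 5, 1), "prefer not to say", False))  # -> False
```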
BCS welcomes the fact that the government is consulting carefully before making any decision. The body says it supports the consultation and will gather views from across its membership.
Related: UK sets out its 10-year plan to remain a global AI superpower
(Photo by Sergey Zolkin on Unsplash)
Find out more about Digital Transformation Week North America, taking place on 9-10 November 2021, a virtual event and conference exploring advanced DTX strategies for a ‘digital everything’ world.