
University secures funding to address lack of security and trustworthiness of AI

18/06/2021

A team at the University of Wolverhampton is creating an innovative new platform to help financial organisations and banks add security, trust and explainability to their Artificial Intelligence (AI) based decisions.

They went through a rigorous selection process to be part of the Cyber Security Academic Startup Accelerator Programme (CyberASAP), which supports the commercialisation of cyber security research.

CyberASAP provides university teams with the necessary expertise and support to convert their academic ideas into commercial products and services in the cyber security landscape. Funded by the UK Government's Department for Digital, Culture, Media and Sport (DCMS), in partnership with Innovate UK and KTN (Knowledge Transfer Network), CyberASAP is now in its fifth year.

The project, ‘TrustMe: Secure and Trustworthy AI platform for FinTech’, is due to run until the end of July 2021.

Led by Dr Ali Sadiq and Professor Prashant Pillai of the University's Wolverhampton Cyber Research Institute (WCRI), the team has been awarded more than £31,000 to develop a market proposition and carry out market validation of their technology.

Lecturer in Computer Science Hiran Patel is also part of the team.

Professor Pillai said: “Out of the 23 teams that were selected at the start of CyberASAP, we are really pleased to be one of the 14 teams that have been chosen by an independent panel to continue to the market validation stage of the programme.

“This stage will help us validate our value proposition and engage with key stakeholders and beneficiaries to further help shape our product features and business plans.”

Dr Sadiq said: “AI algorithms are being used to make many decisions, such as credit ratings, loan/mortgage risk assessments and fraud checks.

“However, the standard AI adopted by most financial organisations is not really suitable for regulated financial services, as we cannot explain how the AI algorithms derive their results.

“There is a growing lack of trust in AI-based decisions, and a rise in wrong or biased decisions occurring due to poor data quality.

“There has been a rise in UK insurers being fined by the FCA due to incorrect decisions made by AI algorithms, and May 2020 saw the first lawsuit over an AI-based FinTech (financial technology) system.”

TrustMe aims to be an innovative platform that helps organisations ensure the quality of their data and AI training, and provides the means to build interpretability and auditability into AI algorithms.

It would be the first platform of its kind to use explainable AI that can be trusted to make fair and accurate decisions.

The team has created an online survey to gather feedback from professionals in the FinTech sector, along with AI experts and end users, to further understand the problem and the features they would like to see in the product.

The survey can be found here.

The new £9 million Cyber Quarter - Midlands Centre for Cyber Security in Hereford has now been completed and is ready to open.

The trail-blazing project on Skylon Park, Hereford Enterprise Zone, is a joint venture between the University of Wolverhampton and Herefordshire Council and part-funded by the Marches Local Enterprise Partnership (LEP) and the European Regional Development Fund (ERDF).

The Centre will offer a package of tailored security testing, training, research and development, and sector expertise to businesses and investors.

For more information contact: Ali Sadiq (Ali.Sadiq@wlv.ac.uk) or Professor Pillai (P.Pillai@wlv.ac.uk)

For more information please contact the Corporate Communications Team.

