
Financial industry must tackle gender bias in algorithms, according to global fintech leader, Finastra

    - Finastra publishes five-point plan to tackle algorithmic bias in consumer finance decision-making

    - New KPMG report, commissioned by Finastra, examines the size of global consumer lending markets and the potential impact of algorithmic bias in society

    - Finastra urges the financial industry to address the problem and work together to help solve it

LONDON, March 4, 2021 /PRNewswire/ -- Finastra, one of the world's largest fintechs, is calling upon the global finance industry to tackle algorithmic bias, which is likely impacting millions of people every day. The fintech firm, which supplies vital technology to financial institutions of all sizes, including 90 of the world's top 100 banks, recently commissioned consultancy firm KPMG to look at the issue across banking, lending and insurance. The research considered how decisions made by this advanced technology have the potential to affect outcomes for certain people and groups. In response to the findings, Finastra has published a five-point plan to identify and tackle algorithmic bias, and is urging the financial industry to come together to take action and build a fairer society.


In the past decade, the financial world has been industrialized and digitalized through the introduction of artificial intelligence (AI), particularly forms of machine learning, boosting efficiency and automating processes. As a result, many banking, lending and insurance decisions are now made by algorithms. The pandemic has accelerated the use of these technologies, and whilst this brings clear positives, these vital algorithms can only be as 'fair' and unbiased as the data sets used to build them. The industry must check whether the biases that exist in society are being repeated through the design and deployment of these technologies.

    To understand the severity of the problem, Finastra commissioned KPMG to produce a report which reveals the sheer size of consumer lending markets and the potential impact of algorithmic bias. For example, in 2020, consumer lending and transactions across key financial products (credit card, other consumer product lending and mortgage/home lending) were over:

    • $6,110bn in the U.S.
    • HK$1,270bn in Hong Kong
    • £440bn in the United Kingdom
    • €280bn in France
    • SG$110bn in Singapore

    Both the provision of this credit and its cost to consumers – e.g. the interest rates charged – will in many cases be informed by the algorithms used.
