Watching AI Software for Bias


Artificial intelligence has become virtually synonymous with generative AI in public discussion. But AI is a much broader field, and the branch most likely in use in CRE, as in many other industries, is machine learning.

As IBM describes it: “Through the use of statistical methods, algorithms are trained to make classifications or predictions, and to uncover key insights in data mining projects. These insights subsequently drive decision making within applications and businesses, ideally impacting key growth metrics.”
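IBM's description can be made concrete with a toy example. The sketch below is a minimal 1-nearest-neighbour classifier in pure Python, using entirely made-up property data; it illustrates the core idea that predictions are derived directly from past labelled examples.

```python
# Minimal sketch of supervised machine learning: a 1-nearest-neighbour
# classifier "trained" on past, labelled examples. All data is hypothetical.

def euclidean(a, b):
    # straight-line distance between two feature vectors
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def predict(points, label_of, point):
    # The prediction is simply the label of the closest past example,
    # so past decisions entirely determine future ones.
    nearest = min(points, key=lambda ex: euclidean(ex, point))
    return label_of[nearest]

# Past examples: (price per sq ft, occupancy rate) -> "buy" / "pass"
training = {
    (120.0, 0.95): "buy",
    (300.0, 0.60): "pass",
    (150.0, 0.90): "buy",
    (280.0, 0.55): "pass",
}
points = list(training)
print(predict(points, training, (140.0, 0.92)))  # -> "buy"
```

Real systems use far richer models, but the dependence on historical training data is the same, which is exactly where the bias problem discussed below enters.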

The techniques are widespread across many industries and applications, including engineering, manufacturing, image recognition, autonomous vehicles, automatic translation, and data analysis, among many others.

When you hear that proptech software uses “AI,” machine learning is probably the technique involved, most often employed to recognize patterns and implement classifications. If you use software to pull important data out of spreadsheets or documents, or to identify properties with the characteristics you seek in a potential investment, you are almost certainly relying on machine learning.

All well and good, except for one aspect: unlike in investing, here the past really is a guide to the future. The initial training data has enormous implications for how the system will work.

Here’s an example passed on by a marketing expert. A cable company decides to focus its messages and attention on improving outreach to its customers, so it looks for the most frequent and engaged users. Unfortunately, the heaviest users of the cable company’s programming turn out to be the customers who are out of work and can’t afford additional services.

That may seem frivolous, but the issue is called bias. Just as people can have cognitive biases, so can machine learning systems, because they are built by people and trained on decisions that people have made in the past.


A famous example is that of Amazon. The company set up a system to review resumes for potential hires, as Reuters reported a few years ago. The hope was that the software would process resumes and suggest the best possible hires.

Unfortunately, the software didn’t make decisions in a gender-neutral way. The problem? The training materials were the resumes the company had received over the years, along with the hiring decisions it had made. A gender bias already existed in hiring, and now it was built into the software. The project was ultimately scrapped.
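The mechanics of that failure are easy to reproduce in miniature. The sketch below uses entirely synthetic data and a deliberately naive frequency model (not Amazon's actual system) to show how a resume token that merely correlates with gender picks up a penalty from biased past decisions.

```python
# Sketch: how historical bias leaks into a learned model. All data is
# synthetic. We "train" a per-token hire-rate model on past hiring
# decisions; the hypothetical token "womens" (as in "women's chess club")
# appears only on resumes that were historically rejected, so the model
# learns to penalize it.

from collections import defaultdict

def train(history):
    # learned hire rate for each resume token, from past decisions
    seen, hired = defaultdict(int), defaultdict(int)
    for tokens, decision in history:
        for t in tokens:
            seen[t] += 1
            hired[t] += decision
    return {t: hired[t] / seen[t] for t in seen}

def score(model, tokens):
    # candidate score = average learned hire rate over known tokens
    rates = [model[t] for t in tokens if t in model]
    return sum(rates) / len(rates)

# Past decisions already skewed against resumes containing "womens"
history = [
    (["python", "finance"], 1),
    (["python", "womens"], 0),
    (["sql", "womens"], 0),
    (["sql", "finance"], 1),
]
model = train(history)
# Identical skills, one gender-correlated token of difference:
print(score(model, ["python", "finance"]))  # 0.75
print(score(model, ["python", "womens"]))   # 0.25
```

The model never sees gender directly; the bias arrives entirely through correlated features in the historical record, which is why it is so hard to scrub out.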

This isn’t the only example. In late 2019, the National Institute of Standards and Technology released the results of a face recognition software test in which “false positives rates often vary by factors of 10 to beyond 100 times” between samples of faces of different races. Whatever the cause — something in the algorithms, the training samples, the hardware, or elsewhere — the result was repeated racial bias.

This is all to show that with the best of intentions, things can go badly wrong. It would be an act of wishful thinking to assume CRE software employing machine learning couldn’t also exhibit bias of whatever kind.

A complicating factor is that CRE firms typically use software written by third parties, so they won’t have access to details that might prove important in diagnosing a problem.

The thing to do is experiment. Track results, build your own database of examples, and then see whether they reveal patterns of bias. Bias won’t just be embarrassing; it can open companies to potential legal jeopardy.
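As a sketch of that audit, assuming a simple decision log with hypothetical "group" and "flagged" fields, comparing outcome rates per group is a reasonable first check:

```python
# Sketch of the suggested audit: log your software's decisions along
# with a group attribute, then compare outcome rates across groups.
# The log below is synthetic and the field names are assumptions.

from collections import defaultdict

def rate_by_group(log):
    # fraction of records flagged, broken out by group
    total, positive = defaultdict(int), defaultdict(int)
    for rec in log:
        total[rec["group"]] += 1
        positive[rec["group"]] += rec["flagged"]
    return {g: positive[g] / total[g] for g in total}

log = [
    {"group": "A", "flagged": 1}, {"group": "A", "flagged": 0},
    {"group": "A", "flagged": 0}, {"group": "A", "flagged": 0},
    {"group": "B", "flagged": 1}, {"group": "B", "flagged": 1},
    {"group": "B", "flagged": 1}, {"group": "B", "flagged": 0},
]
print(rate_by_group(log))  # {'A': 0.25, 'B': 0.75}, a 3x disparity worth investigating
```

A large gap between groups doesn't prove bias on its own, but it tells you where to start asking the vendor questions.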



September 1, 2023