Original Article By Matt Agorist At TheFreeThoughtProject.com

If you’ve spent any time on social media or other technology platforms, you know the problem with algorithms and the ridiculous decisions artificial intelligence can make. While most algorithms are used to monitor, track, and gather data on users for advertising purposes, other, more insidious algorithms are used to censor those who question the status quo.

We’ve seen situations in which people were banned from social media platforms for completely benign posts that are entirely apolitical and non-offensive. Because the algorithm lacks all nuance and discretion, its often irreversible decisions lay waste to entirely innocent actors. It is for this reason that the algorithms used or being considered by child welfare agencies across the country should send chills down your spine.

In the following example, the consequences are far graver than having a post taken down or facing a ban on Twitter. The consequences of these algorithms could involve losing your children.

According to a new study, social workers with child protective services in Pittsburgh are using an algorithm to help decide whether parents should have their children taken from them.

“A lot of people don’t know that it’s even being used,” attorney Robin Frank told the AP in a story covering the algorithms. “Families should have the right to have all of the information in their file.”

As the AP reports:

From Los Angeles to Colorado and throughout Oregon, as child welfare agencies use or consider tools similar to the one in Allegheny County, Pennsylvania, an Associated Press review has identified a number of concerns about the technology, including questions about its reliability and its potential to harden racial disparities in the child welfare system. Related issues have already torpedoed some jurisdictions’ plans to use predictive models, such as the tool notably dropped by the state of Illinois.

According to new research from a Carnegie Mellon University team obtained exclusively by AP, Allegheny’s algorithm in its first years of operation showed a pattern of flagging a disproportionate number of Black children for a “mandatory” neglect investigation, when compared with white children. The independent researchers, who received data from the county, also found that social workers disagreed with the risk scores the algorithm produced about one-third of the time.

Though the CMU study focused on Allegheny County, similar tools are being used or considered in California, Colorado, and Oregon.

Families whose information is entered into these algorithmic tools are subject to the AI’s results, and a whopping two-thirds of them are flagged for investigation, most of them Black.

“It’s not decreasing the impact among Black families,” said Logan Stapleton, a researcher at Carnegie Mellon University. “On the point of accuracy and disparity, (the county is) making strong statements that I think are misleading.”

Because the algorithm’s results are kept secret, the AP reports that families and their attorneys can never be sure of the algorithm’s role in their lives because they aren’t allowed to know the scores. This is something activists are trying to change.

Allegheny County officials say the algorithm only aids workers in making decisions; exactly how much it aids them remains a mystery.
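To make concrete what “a score aiding a decision” can mean, here is a minimal, purely hypothetical sketch in Python. The county’s actual model, features, weights, and cutoffs are not public, so every name and number below is invented for illustration; only the general idea of a risk score with a “mandatory” top tier comes from the AP’s reporting.

```python
# Hypothetical illustration only: the real screening tool's features,
# weights, and thresholds are not public. Everything here is invented
# to show the general shape of "a score informs a human decision."

from dataclasses import dataclass

@dataclass
class Referral:
    """A child-welfare referral, reduced to made-up numeric features."""
    prior_referrals: int        # hypothetical feature
    household_size: int         # hypothetical feature
    public_benefits_flags: int  # hypothetical feature

def risk_score(r: Referral) -> int:
    """Toy scoring rule clamped to a 1-20 range. A real predictive
    model is statistical, not hand-weighted like this."""
    raw = 2 * r.prior_referrals + r.public_benefits_flags + r.household_size // 3
    return max(1, min(20, raw))

MANDATORY_THRESHOLD = 18  # invented cutoff for a "mandatory" screen-in

def screening_decision(r: Referral, worker_screens_in: bool) -> bool:
    """Below the top tier the human worker makes the call; at the top
    tier the flag is mandatory and overrides the worker."""
    if risk_score(r) >= MANDATORY_THRESHOLD:
        return True  # algorithm forces an investigation
    return worker_screens_in  # otherwise the worker can disagree

# Example: a high score flags the family even though the worker said no.
family = Referral(prior_referrals=8, household_size=6, public_benefits_flags=3)
print(risk_score(family), screening_decision(family, worker_screens_in=False))
# prints: 20 True
```

Under this toy rule, the worker’s “no” is overridden at the top tier. That is precisely the dynamic the researchers and attorneys quoted above are worried about: a family’s fate can hinge on a number they are never shown.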

While it is important to protect children, an algorithm falsely flagging a family for abuse, and that flag then being used to take their children, is a slippery slope. This is why Illinois decided to stop using its own predictive tool.

What’s more, if these children are removed from their families for frivolous reasons, the data shows they may end up in utterly horrifying situations.

According to the government’s own data, the vast majority of sex-trafficked kids come from the very government system that promises to keep them safe.

This system is set up to pull children from their families for ridiculous reasons and turn them over to for-profit systems, funded by your tax dollars, that use these children as cash cows and have no incentive to keep them safe. These algorithms could amplify this problem.

In 2019, the National Center for Missing & Exploited Children (NCMEC) assisted law enforcement with over 23,500 cases of missing children, all of whom were considered endangered runaways.

According to its most recent report, compiled from FBI data and its own, of the nearly 24,000 runaways reported to NCMEC in 2019, one in six were likely victims of child sex trafficking. Of those, 88 percent were in the care of social services when they went missing.

How many of these children were falsely flagged by algorithms?

In the AP’s report, officials in Allegheny County defend the algorithm and state that humans make the final decision. However, humans are often careless and reckless too, as the FBI showed in a 2014 nationwide raid: many of the foster children rescued from sex traffickers, including children as young as 11, had never been reported missing by child welfare authorities.

So, while algorithms certainly have their place in making our lives easier and more efficient, when they begin to make decisions that could rip families apart, perhaps we should consider treading lightly.

