The impact of racial bias on children in foster care

Disproportionality describes the proportion of racial minority children in the child welfare system relative to their representation in the general population (Kokaliari, Roy, & Taylor, 2019; Hill, 2006). African American, Native American, and Hispanic children continue to be overrepresented in the child welfare system (Miller, Cahn, & Orellana, 2012). There are known disparities in the treatment of these children compared to their white counterparts, including longer stays in care, increased risk of re-victimization and placement instability, and less frequent placement permanency (Huggins-Hoyt, Briggs, Mowbray, & Allen, 2019).

Research suggests that race can influence medical professionals' decisions to make a report as well as the subsequent actions of child welfare workers during investigations (Miller et al., 2012). Miller et al. (2012) and Kokaliari et al. (2019) each identify additional factors that affect decision-making, both highlighting families' lack of trust in the system and the impact of poverty on family resources.

In New York State, minority children are disproportionately represented in care: they make up a much higher percentage of children in care than of the total population (The Annie E. Casey Foundation, 2019; NYS OCFS, 2019). Families who interact with the child welfare system need to know the factors that contribute to this ongoing disproportionality in entering and remaining in care so that they are better prepared to collaborate with the system. With an improved understanding, parents may better navigate the process, reduce the length of time in care, and thereby reduce the trauma to the family and children. It may also encourage more people to become foster parents, increasing the number of available placements, including racially and culturally similar homes.

This infographic provides statistics on the number of racial minority children in foster care in 2018, how that number compares to their representation in the general population, and how frequently children are placed in homes racially and culturally similar to the family of origin. The graphic also illustrates how racial bias impacts minority children in care and makes broad recommendations to improve services. It concludes with a website that provides data, recommendations, and tools to benefit children.

References

Hill, R. B. (2006). Synthesis of research on disproportionality in child welfare: An update. Casey-CSSP Alliance for Racial Equity in the Child Welfare System. Retrieved from https://www.cssp.org/reform/child-welfare/other-resources/synthesis-of-research-on-disproportionality-robert-hill.pdf

Huggins-Hoyt, K. Y., Briggs, H. E., Mowbray, O., & Allen, J. L. (2019). Privatization, racial disproportionality and disparity in child welfare: Outcomes for foster children of color. Children and Youth Services Review, 99, 125-131.

Kokaliari, E.D., Roy, A.W., & Taylor, J. (2019). African American perspectives on racial disparity in child removals. Child Abuse & Neglect, 90, 139-148.

Miller, K. M., Cahn, K., & Orellana, E. R. (2012). Dynamics that contribute to racial disproportionality and disparity: Perspectives from child welfare professionals, community partners, and families. Children and Youth Services Review, 34, 2201-2207.

New York State Office of Children and Family Services (2019). 2018 Monitoring and Analysis Profiles with Selected Trend Data: 2014-2018. Office of Strategic Planning and Policy Development/Bureau of Research, Evaluation and Performance Analysis. Retrieved from https://ocfs.ny.gov/main/reports/maps/counties/New%20York%20State.pdf [accessed 8 November 2019]

The Annie E. Casey Foundation Kids Count Data Center. (n.d.). Population by race in New York. Retrieved from https://datacenter.kidscount.org/data/tables/103-child-population-by-race?loc=1&loct=2#detailed/2/34/false/37/68,67,12,66,72/423,424 [accessed 29 November 2019]

When did we go from trying to program a VCR to using smartphones to make mental health diagnoses?

Remember when VCRs came out and you could record a show to watch later? It was amazing to think we never again had to miss our favorite shows because we had to go to school or use the bathroom! And remember how we would roll our eyes at our parents, as if they were old and didn't get it, just because they needed help programming the VCR?

From young children through young adulthood, the youth of today are growing up in a culture of technology. The innovations they currently use daily, and the ones they will help develop, almost seem like fantasy in comparison to the invention of the VCR. It is safe to say technology is here to stay, and if we are to remain relevant as clinicians, teachers, and caregivers, we need to not only use it but embrace it if we have any chance of connecting with younger generations.

Some people question the outsized impact technology has on the interpersonal skills of the younger generation and feel young people should use it less rather than older generations learning to use it more. That would be fine if technology were simply a tool for entertainment or socializing. Instead, technology is what makes the world go around, and without staying on top of the latest trends and tools, older generations could find themselves at a disadvantage in the workplace, possibly losing out on job opportunities to younger, more tech-savvy employees. Further, without closely studying how companies use technology to make decisions, you may find yourself the victim of big data.

What is "big data," you ask? Big data is the thousands of pieces of information collected daily and used to make decisions about what products are offered, what ads you see online, and even what results populate your Google search.

Pretty scary, right? It can be, and yet for every yin there is a yang. In social work, particularly in mental health, big data can be and is being used to advance research by creating bigger data pools across several regions and subsets, making the results potentially more generalizable and useful. Nicola Davies, PhD, wrote an article in 2016 discussing the availability of large data sets that can be quickly and easily compared through statistical analysis, and how they are being used to help with suicide prevention by predicting suicidal behavior. Going further, in 2017 Psychology Today published an article, "Will Big Data Save Psychiatry?", in which Paul Raeburn discusses the potential impact on proper schizophrenia diagnosis for clients. Researchers were able to use big data to diagnose schizophrenia with a less than 10% margin of error. Given the difficulties in accurately diagnosing psychosis, and the potential impact on clients when inaccurate diagnoses are made either way, this could be game-changing for mental health.

So how does a computer do this? How does it take large amounts of big data and use it to make diagnoses and assessments? In two words: artificial intelligence. Or in simpler terms, these machines learn. They take in the data they are provided, apply it to the framework they were programmed with, and use this information to make educated predictions or decisions. Think about it. Isn't that how we as humans make decisions? We take the information we are given, compare it to what we already know, and decide based on these two points of data. In humans that is intelligence; for computers it's artificial intelligence. That phrase is scary to some, as if the characters in old sci-fi movies are coming to life. Yet if you were one of those people who needed an accurate diagnosis for the best possible treatment in order to live your most fulfilling life, wouldn't you want providers to have the tools to get it right?
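To make that compare-and-decide idea concrete, here is a toy sketch in Python — not a clinical tool, and not the method of any study mentioned here. It is a one-nearest-neighbor "classifier" that labels a new case by finding the most similar case it has already seen, which is exactly the loop described above: compare new information to what you already know, then decide. All feature values and labels are invented for illustration.

```python
# Toy 1-nearest-neighbor classifier: decide about a new case by
# reusing the label of the most similar case seen so far.
# All data below are made up for demonstration only.

def nearest_neighbor_predict(known_cases, new_case):
    """known_cases: list of (features, label) pairs; new_case: a feature tuple."""
    def distance(a, b):
        # Euclidean distance between two feature tuples
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    # Find the previously seen case closest to the new one
    closest = min(known_cases, key=lambda case: distance(case[0], new_case))
    return closest[1]  # reuse its label as the prediction

# Hypothetical screening scores: (symptom scale, sleep quality)
known = [((8.0, 2.0), "flag for follow-up"),
         ((7.5, 3.0), "flag for follow-up"),
         ((2.0, 8.0), "no flag"),
         ((1.5, 7.0), "no flag")]

# A new case resembling the flagged ones gets the flagged label
print(nearest_neighbor_predict(known, (7.0, 2.5)))
```

Real systems use far richer models and far more data, but the underlying pattern — learn from examples, compare, decide — is the same.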

Artificial intelligence is being used in the area of trauma as well. PTSD is most often diagnosed through clinical interview or self-report, both means prone to bias on the part of the clinician and the client. Working with veterans, researchers were able to identify 18 of 40,000 unique biophysical features of speech that could predict a diagnosis of PTSD with 89% accuracy. Not only is this helpful for proper identification of trauma symptoms, it can be particularly helpful in telemedicine, where in-depth interviews over time are not possible for people who live in outlying areas and do not have the resources to access mental health providers and treatment.
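The core idea behind that speech study — winnowing tens of thousands of candidate features down to the handful that actually distinguish the groups — can be sketched in a few lines. This is a hypothetical illustration of feature ranking, not the researchers' actual method; the feature names and measurements are invented.

```python
# Toy feature selection: rank candidate features by how differently
# they behave in two groups, then keep only the top few.
# Feature names and values are invented for illustration.

def rank_features(group_a, group_b, names):
    """Score each feature by the absolute difference in group means,
    and return feature names from most to least discriminating."""
    scores = []
    for i, name in enumerate(names):
        mean_a = sum(row[i] for row in group_a) / len(group_a)
        mean_b = sum(row[i] for row in group_b) / len(group_b)
        scores.append((abs(mean_a - mean_b), name))
    return [name for _, name in sorted(scores, reverse=True)]

# Hypothetical speech measurements: (pitch variation, pause length, volume)
ptsd_group = [(0.2, 1.8, 0.50), (0.3, 2.1, 0.52)]
comparison = [(0.8, 0.6, 0.51), (0.7, 0.5, 0.49)]
names = ["pitch variation", "pause length", "volume"]

# Pause length and pitch variation separate the groups; volume adds little
print(rank_features(ptsd_group, comparison, names))
```

A real study would use statistical tests and cross-validation rather than raw mean differences, but the principle is the same: most of the 40,000 features carry little signal, and the model is built on the few that do.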

As amazing as all of this is, artificial intelligence and big data do not come without their challenges. Unrepresentative pools of client data can skew outcomes, and patient confidentiality needs to be respected at all costs. Researchers using these resources need to ensure their technology and protocols are in keeping with clients' rights to privacy and autonomy. Even more important is the objectivity of those programming the computers making the decisions. Unfortunately, human bias can become part of the programmed algorithms through what data is fed into the system, what pieces of information the computer is asked to look at, and how much weight is given to certain data points such as socioeconomic status, race, age, and sex. So even though supporters of increased technology boast that computer-generated data is better because it is more objective than human nature, it is important to remember that algorithms can be as biased as their programmers.

Even with these challenges, however, two things are evident. First, big data and artificial intelligence are here to stay and will only be utilized more over time. Second, these tools have the potential to open new doors and uncover new information that can enhance the quality of people's lives exponentially. If we as social workers continue to have a louder and more involved voice in the development of these tools, future generations may only dream about some of the mental health struggles that our clients face today. The possibilities are endless...
