“It is vital that Congress ensures there are guardrails in place that protect American workers, especially workers of color,” Senator Chuck Schumer tells theGrio.
After President Joe Biden signed the first-ever executive order on artificial intelligence, members of Congress are pushing forward to ensure the Black community is not harmed by AI technology.
“There’s a lot of work that we should be doing [at] this moment to make sure that AI isn’t weaponized against communities of color – Black communities,” U.S. Rep. Yvette Clarke, D-N.Y., told theGrio.
The congresswoman said it’s important to ensure “there isn’t bias baked into the products and services that the average everyday family needs to progress.” She continued, “Whether that is going online to obtain a mortgage to purchase a house, or … dealing with financial aid for students for college.”
In a statement provided to theGrio, Senate Majority Leader Chuck Schumer, D-N.Y., said, “It is vital that Congress ensures there are guardrails in place that protect American workers, especially workers of color.”
“We know that there are far too many potential biases within AI technologies that could unfairly impact people of color,” he added. “Congress must lead the charge in creating policy that will protect civil rights in the age of AI.”
Schumer vowed, “I am committed to passing bipartisan legislation that protects all Americans.”
The latest pronouncements on Capitol Hill come after President Biden issued a new executive order on “Safe, Secure, and Trustworthy Artificial Intelligence” earlier this week. The order established protocols to protect Americans from fraud, invasion of privacy, and violation of civil rights by AI technology.
Mutale Nkonde, CEO of AI For the People, told theGrio, “The executive order is really going to mobilize the Congressional Black Caucus around these issues.”
“Even if they don’t understand it, they are going to understand policing and differential policing, and they are going to want to act,” said Nkonde.
She added that “AI systems are used everywhere” and that Biden’s presidential order could help protect Black Americans from issues such as bias in maternal health.
“When we look at Black [maternal] mortality, one of the underlying reasons for that is pain medication is delivered by AI,” Nkonde explained, “and in many settings when Black women actually complained of pain, they were not being given it at the same rate as white women.”
However, Nkonde believes the executive order neglects to “govern with civil rights in mind.”
“The executive order calls out against algorithmic racism and discrimination,” she said. “But if you go to the very next section labeled ‘Criminal Justice,’ it also then says we’re going to try and make policing technologies and surveillance technologies fairer.”
Nkonde added, “The truth of it is, because of the underlying issues with police and with surveillance, an AI system isn’t going to make it better.”
Clarke told theGrio that the biases mentioned by Nkonde exist due to flawed algorithms.
“A lot of that is driven by algorithms, and those algorithms tend to be developed by individuals who don’t come from the lived experiences of people of color,” said the congresswoman.
“Or they may hold or harbor certain bias against communities of color, Black communities in particular…or may just be uninformed.”
Clarke told theGrio that she believes her Algorithmic Accountability Act, introduced last year, will “mitigate some of the damages that people are experiencing” with AI technology.
In September, Clarke and Senators Cory Booker, D-N.J., and Ron Wyden, D-Ore., introduced the act to ensure that Americans did not face discrimination at the hands of automated systems, whether it pertained to health care, buying a home or getting accepted into college.
In a press release, Booker said, “As algorithms and other automated decision systems take on increasingly prominent roles in our lives, we have a responsibility to ensure that they are adequately assessed for biases that may disadvantage minority or marginalized communities.”
In recent months, members of Congress have worked on new AI legislation.
In October, Senators Chris Coons, D-Del., and Amy Klobuchar, D-Minn., released a draft of a bill called the “NO FAKES Act” that would allow music artists to bring a civil lawsuit over the unauthorized use of their voice or likeness.
For example, earlier this year, a producer named Ghostwriter created a song titled “Heart on My Sleeve” using AI technology that imitated the voices of recording artists Drake and The Weeknd. If the bill had been enacted at the time the song was published, the artists’ label, Universal Music Group, would have been able to bring a civil case against Ghostwriter.
In July, U.S. Reps. Anna G. Eshoo, D-Calif., Michael McCaul, R-Texas, Don Beyer, D-Va., and Jay Obernolte, R-Calif., introduced the “Creating Resources for Every American To Experiment with Artificial Intelligence Act of 2023” (CREATE AI Act).
The act’s main function is to establish the National Artificial Intelligence Research Resource, which would give researchers and students the resources to develop “safe and trustworthy artificial intelligence.”