Black roboticists on racism, bias, and building better AI

Jasmine Lawrence works on the Everyday Robots project at Alphabet’s X moonshot factory. She thinks there are many unanswered ethical questions about how to use robots and how to think of them: Are they slaves or tools? Do they replace or complement people? As a product manager, she said, confronting some of those questions can be frightening, and it raises questions about bias and the responsibility of the creator. Lawrence said she wants to be held accountable for the good and bad things she builds.

“I want to be called out. I want to be yelled at. I want to be challenged. I want to be encouraged to use renewable energy, or I want to hear from people who are allies and advocates for communities that my personal work might be negatively affecting,” she said. “I will admit that I have blind spots … and I’d love to see that from every builder, every inventor, every creator, every student, just that ownership that I might hurt someone. I know I didn’t try, I’m not being intentional, but it just might happen. So there’s a lot of soul searching, and there’s a lot of actions that we can truly take.”

University of Michigan professor and Laboratory for Progress director Chadwick Jenkins said what keeps him up at night is the thought that robotics won’t be used to advance humanity, create more jobs, or aid in space exploration, but to make cheaper products and reduce the need for human labor. He noted that the Voting Rights Act and Moore’s law are contemporaries, both dating to 1965.

“[W]hat you’ve seen is both an exponential increase in our computational power as given by Dennard scaling and Moore’s law, but you’ve also seen an exponential increase in incarceration,” Jenkins pointed out. “Michelle Alexander goes through this in The New Jim Crow, but I think I’m worried about the future Jim Crow, which is what I worry we’re building using AI technology and large scale computing.”


Lawrence and Jenkins shared their optimism, concerns, and thoughts about next steps on Tuesday evening as part of a panel discussion about race, robotics, and AI put together by the nonprofit Silicon Valley Robotics. The conversation focused on how to improve experiences for Black students in STEM education and how the tech sector can better reflect the diversity found in the United States. Silicon Valley Robotics managing director Andra Keay acted as moderator of the conversation.

Historic protests in recent weeks against institutional racism and the killings of Breonna Taylor, George Floyd, and other Black people have led to some police reforms and renewed commitments by tech giants. In STEM education, tech, and AI, progress toward equitable diversity has been especially slow, however.

Also on the panel was Maynard Holliday, cofounder of the Defense Innovation Unit, a team created during the Obama administration to help the Pentagon adopt emerging technology. He wants to see executive compensation directly tied to diversity and inclusion metrics, and corporate boards of directors held accountable for poor progress toward diversity.

He also endorses the idea of an algorithmic bill of rights that would give people certain inalienable rights when dealing with artificial intelligence. Such a document would ensure transparency and give people the right to know when an algorithm is being used to make a decision about them, the right to redress if an algorithm makes a mistake, and freedom from algorithmic bias. Bills introduced in Congress last year, like the Algorithmic Accountability Act and proposed data privacy legislation, would also require bias testing or outlaw algorithmic bias. The idea of a Hippocratic oath for AI researchers has also come up in the past.


Jenkins wants to see academic institutions take a company’s diversity record into account when considering faculty requests for sabbaticals to work there.

“[T]here are lots of university faculty that take leaves and sabbaticals to work at companies that have not had great representation. I can think of just a few examples, like OpenAI or Google’s DeepMind. Maybe universities shouldn’t offer sabbatical leaves to faculty that are working at those companies,” Jenkins said. “That’s a placeholder measure, but at the end [of the day] it’s about how do you affect the funding that is allowing us to pick winners and losers.”

Jenkins also endorsed following the guidance in an open letter from blackincomputing.org published earlier this month. The letter says Black people in machine learning know what it’s like to be treated differently and acknowledges that “the structural and institutional racism that has brought the nation to this point, is also rooted in our discipline.” The letter is signed by Jenkins, more than 100 other Black people in AI and computing, and more than 100 allies.

“We know that in the same way computing can be used to stack the deck against Black people, it can also be used to stack the deck against anyone,” the letter reads. “We see AI and big data being used to target the historically disadvantaged. The technologies we help create to benefit society are also disrupting Black communities through the proliferation of racial profiling. We see machine learning systems that routinely identify Black people as animals and criminals. Algorithms we develop are used by others to further intergenerational inequality by systematizing segregation into housing, lending, admissions, and hiring practices.”

The letter also contains calls to action and demands to uphold existing civil rights laws like the Civil Rights Act of 1964; Title IX of the Education Amendments of 1972, which prohibits exclusion on the basis of sex; and the Americans with Disabilities Act of 1990. Jenkins believes enforcing those laws could help address a lack of diversity in education.

“[A]t a university, our product is our ideas and people, and we are funded by tuition and public funding, so I think we should try to represent that public and the future people better by shaping that economic incentive,” Jenkins said.

Making space for Black people in robotics

Jenkins believes incentive structures within organizations must also change to give people a reason to promote diversity, rather than placing those who are passionate about it at a disadvantage.

“I know if I care about diversity and equal opportunity, I will have to make a professional sacrifice to provide the mentorship and effort needed to broaden participation in our field,” he said.

Socializing with people in positions of privilege within the existing power structure can be an important part of gaining access to funding, hiring, publishing, citations, and other advantages associated with the peer review process. If doing diversity work carries no economic incentive or doesn’t make a person more attractive for things like hiring, promotion, or tenure, Jenkins said, “then it will move down that stack and we’ll never really address it.”

Monroe Kennedy is an assistant professor in mechanical engineering at Stanford University, where he leads the Assistive Robotics and Manipulation (ARM) Lab. On the panel, he spoke from the perspective of an educator in academia, asserting that engaging young people is essential for computer scientists.

“I can tell you beyond a shadow of a doubt, and only, I think, educators know this: When you look into a child’s eyes and you see that moment, you literally could have changed the direction that that person might go,” he said after describing an encounter with a young Black student in a classroom. “We who do the research in this space, who do the amazing things that we do, we have a profound responsibility to go into these spaces and change the status quo and realize the power that a few words — and more importantly, your time — has when it comes to making a difference at that level.”


He knows that experience from the other side, too. Kennedy became a professor because his graduate advisors, neither of whom was Black, told him he was good enough to be one someday. “I have self-confidence, but it’s different when the person that you respect and is in that leadership role looks into your eyes and sees that special thing as well,” he said.

Echoing Lawrence’s remarks, Kennedy and other Black roboticists on the panel also talked about the importance of starting by discovering your own biases. Nobody, Lawrence said, is immune to bias.

“I’m not immune to it just because I’m a Black woman. I’m not bias-free or discrimination- or judgment-free. So I would say, acknowledge them, discover them, challenge yourself, and recognize where you have those areas of growth,” she said.

She also joined multiple panelists in stressing that the onus of diversity initiatives should not fall disproportionately on Black employees who, like her, may want to avoid being labeled by coworkers as a social justice warrior rather than a thoughtful product manager who believes diversity should be a priority.

“Making space for me to solve problems or for us to congregate as minorities — I just don’t see the progress there, and I do feel the fear of alienating my co-workers or seeming like I have any idea how to fix this or that I have everything going on,” she said.

Jenkins said he’s supportive of people willing to speak publicly about the need for diversity, but he has heard personally from people who feel they cannot speak up or participate in discussions around diversity without being labeled as angry. “I would say that a lot of Black people in engineering, robotics, [and] AI still have trouble coming up and speaking the truth from their perspective. I think a lot of people still feel like there will be a penalty, they will be labeled as angry or uncivil or worse. I’ve heard worse terminology come up,” Jenkins said.

‘No Justice, No Robots’

Tom Williams also spoke on the panel. Williams, who is White, is a professor at the Colorado School of Mines. In an open letter published online earlier this month titled “No Justice, No Robots,” Williams and other roboticists in Colorado pledged not to create robots of any kind for police.

“We have to not just refuse to build robots that actively cause harm, which seems obvious … but I would also argue that we should be refusing to build even benign or socially beneficial robots in collaboration with or for the use of police because of what that action would communicate to our community about our moral principles,” he said.

“If we choose to build robots for or with the police, regardless of how socially beneficial those robots might be, this action does two things. First, it launders the reputation of the institution of policing. And two, it condones the existence and behavior of that institution, which is deeply problematic,” Williams said.


