Listening to Black Voices

Information Engineer Derik Perry on Artificial Intelligence, Algorithmic Bias and Social Media

"We have to be aware that social media has the potential to do harm and that, purely by the use of it, people are exposed to an algorithmic bias."

By Heather Dunhill December 14, 2021

This article is part of the series Listening to Diverse Voices, proudly presented by Gulf Coast Community Foundation.

Derik Perry

As a child, Derik Perry and his family relocated to a largely white community in West Orange, New Jersey, about three miles away from their previous home in Glen Ridge. It was a transformative experience: the Perrys were the first Black family on their street. Perry's early social integration informed a lifetime of dedication to diversity, equality and inclusion, a common thread throughout his 30-year career as an information engineer.

Perry holds an M.B.A. in international finance from George Washington University; he's also currently working on his Ph.D. in information systems and communications at Robert Morris University. “Other than family,” Perry says, “the time-consuming work of my doctorate is most important because I want to make the world of artificial intelligence safer and better by removing bias and inequities in the technical deployments.”

As if he wasn't busy enough, Perry also serves as an advisor to government C-level and operational executives with the MITRE Corp. in Washington, D.C. His focus is AI oversight with high-stakes decision support, which is used in weapon systems, judicial decisions and medical applications. He also holds top secret security clearance with the departments of Homeland Security, Defense, and Justice.

During his 15 years with MITRE, Perry created and launched a group within the company called the Black Culture Network, a resource to help MITRE understand Black culture. It makes sense that Perry launched this initiative; he's been mentoring the Black community since his college days, when he joined Alpha Phi Alpha (APA), the first intercollegiate Black fraternity, which focuses on social justice. W.E.B. Du Bois is a notable alumnus. Perry is also a member of the Iota Upsilon Lambda chapter of APA in Silver Spring, Maryland, which is referred to as the “Monument Chapter” because the idea for the Martin Luther King, Jr. memorial on the National Mall began with it. In the early 1990s, Perry joined the group in lobbying Congress for an extension of the fundraising period allotted for building the memorial.

Perry has also served as chair of George Washington University’s African American Advisory Board and is currently a trustee with The Stone Ridge School of the Sacred Heart in Bethesda, as well as a member of national organizations such as the National Association of Black Engineers. A newcomer to Sarasota, he and his wife Lynn split time here and in Washington, D.C.

What was it like growing up in New Jersey?

“We were the first Blacks in our community, and the first on my street. I attended Mountain High School, a public school in West Orange. It was 30 to 40 percent less diverse than my school in Glen Ridge. For example, in my graduating class of 230, there were seven people of color.

“My dad, who is my biggest influencer by far, taught me how to golf. We lived across the street from the first tee at Francis A. Byrne Golf Course, but we would drive to the Weequahic Golf Course in Newark. I always wondered why, but he wouldn’t explain other than it was more ‘fun.’ I now know that it was because Newark was a predominantly Black city and our golf experience there was communal. However, I did sneak on and play the course across the street after hours. And I became the first Black captain of the high school golf team.”

Tell us about your family.

“My mom grew up in Boston, and my dad is from a one-traffic-light town in New York called Hillburn, which is right across the New Jersey border. The Perry family is an example of diversity—we are Black, American Indian, Irish and Ashkenazi Jewish.

“My dad’s second cousin, Dwaine Perry, is not only the current mayor of Hillburn, he is also chief of the Ramapough Lenape American Indian tribe. When I was young, my family would go to Hillburn for powwows, the American Indian cultural celebration, which was a rare opportunity. As a kid, I didn’t know I was experiencing that culture, but I was aware when people were oppressing us.

“For example, Ford and other industrial companies had plants just outside the town where the Ramapough lived. These companies polluted the water and the woods where we played as children by dumping industrial chemicals such as paints and solvents. It was a cheap way to get rid of them. As a result, many of my Ramapough family members have health issues that persist to this day. HBO followed the tribe during the legal battle in a documentary titled Mann v. Ford.”

How were you exposed to social diversity?

“I became aware of diversity when I went to college. Growing up, I was the odd man out in West Orange. Often, I’d travel to South Orange with my older brother—it was more diverse, and there were more Black girls, so I loved tagging along and hanging out with him because he exposed me to more of us.

“I also liked hopping in and out of different cultures, not just within my family. Growing up, the diversity in my life was about 1 percent people of color, plus Jewish and Italian families.

“I started college at Drexel University in Philadelphia before I transferred to George Washington University. I noticed that all the Black girls lived on one floor of the dorm. By that time in my life, I had moved in different circles of people. When I was in South Orange, I would hang out with Black people at Jack and Jill parties in their homes, but in West Orange, I would be the only Black kid at an event.

“At Drexel, I noticed that larger groups of Black people would congregate in that white society. They would sit in the ‘Black corner’ of the cafeteria, which was a common occurrence at a predominantly white college. This was different from what I had experienced to that point.

“In my sophomore year, I transferred to GWU, where I had many Jewish friends from high school who helped ease me in. When I pledged with the multi-college chapter of Alpha Phi Alpha, I met a variety of Black folks in different circles.

“One night, while hanging out at Georgetown University’s Black Peoples Union’s ‘soul night’ party, I noticed students wearing buttons that said, ‘Know Thyself,’ the ancient maxim carved on the Temple of Apollo at Delphi. That made me realize that I had to move differently in the world.

“That, plus my membership in APA, made me take seriously the need to help students of color who needed support on campus and in life. I became a mentor so that the people behind me were supported. GWU was a transformative time for me in many ways.”

You are an information engineer—how do you suggest we reshape the influential algorithmic decision-making of social media? I’m thinking of misinformation, sensationalism or “hate clicks.”

“That’s an important question. It’s one that we must address, but I honestly don’t have the answer.

“First, we have to be aware that social media has the potential to do harm and that, purely by the use of it, people are exposed to an algorithmic bias. Social media is tuned toward getting an effect—or a rise—out of you. Essentially, it tends to skew toward negative content, because that negative content will get a reaction, and it will emphasize that over and over again. A search engine is the ultimate example of that bias. As a result, all people are subject to harmful media stimulus, which can inflame public discourse. Not everyone is aware enough to know that they are being fed this information.

“To fix it, we must be aware that bias exists in algorithms. Then we must identify where those algorithmic decisions are made and where inherent bias exists.”

Tell us more about algorithmic bias.

“If the data and information used to develop the algorithm has a bias in it, then it didn’t draw on diverse populations and it won’t work equally well for different populations. This means that the data for the algorithm only looked at a subset, including only a certain type of people. Therefore, the decision that comes out of the algorithm will be biased.

“One example is certain databases that identify criminals. There’s been a long history of how people of color come up as potential suspects because the results are skewed negatively in identification. The databases present poor outcomes for Black folks. For example, when a system was used to identify a potential criminal in a particular location with a specified history, it would produce more darker-skinned people than others.

“Other examples are the original databases built by Amazon and Google. They first deployed facial recognition systems and services that were historically bad at identifying dark-skinned Black women, or those with Black hairstyles. They would inappropriately be mislabeled as male or otherwise misidentified. That occurred because the initial training database was populated with mostly white people.

“Governance happens at multiple levels. Everyone who touches AI has an opportunity to do it right. Most believe that the algorithm decisions happen at a high level, such as a board of directors, but it happens at all levels. The person at a keyboard doing the coding is making decisions, as well.”
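
Perry's point about unrepresentative training data can be sketched in a few lines of plain Python (the scores below are hypothetical, not drawn from any real dataset or from the systems he names): a decision rule fit only on one group's score distribution carries that group's characteristics into its predictions for everyone else.

```python
# Hypothetical illustration of sampling bias: a simple classifier
# "trained" on data from only one group performs worse on a group
# whose scores are distributed differently under the same measurement.

group_a_scores = [0.2, 0.3, 0.25, 0.8, 0.9, 0.85]
group_a_labels = [0, 0, 0, 1, 1, 1]  # 0 = negative, 1 = positive

# Group B's "positive" cases happen to score lower on the same scale.
group_b_scores = [0.15, 0.2, 0.1, 0.5, 0.55, 0.6]
group_b_labels = [0, 0, 0, 1, 1, 1]

# Fit a threshold using ONLY group A (the unrepresentative training set):
# the midpoint between the mean negative and mean positive score.
neg = [s for s, y in zip(group_a_scores, group_a_labels) if y == 0]
pos = [s for s, y in zip(group_a_scores, group_a_labels) if y == 1]
threshold = (sum(neg) / len(neg) + sum(pos) / len(pos)) / 2

def accuracy(scores, labels, t):
    """Fraction of examples the threshold rule classifies correctly."""
    preds = [1 if s >= t else 0 for s in scores]
    return sum(p == y for p, y in zip(preds, labels)) / len(labels)

print(accuracy(group_a_scores, group_a_labels, threshold))  # 1.0 on group A
print(accuracy(group_b_scores, group_b_labels, threshold))  # lower on group B
```

The rule is perfectly accurate on the group it was built from and misclassifies members of the other group, even though both groups' labels are equally learnable; that is the structural pattern behind the facial recognition failures Perry describes.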

How do we fix this?

“We have to ask if the team, at all levels, is diverse enough to consider different viewpoints on how the algorithm is built—and what should be considered. These are high-stakes decisions, especially if they affect a large portion of the population’s happiness, economic situation, civil liberties, and well-being.”

UNESCO’s Director-General Audrey Azoulay said, “The world needs rules for artificial intelligence to benefit humanity.” What say you?

“I firmly agree. Especially since the intent of my doctorate is to make AI safer and innovate ways to do so.

“Suffice it to say, AI is constantly becoming more capable and more powerful and scaling at a rate that is beyond human capability—it becomes self-learning. As AI learns, it gets better, and its slope of growth greatly increases. As such, there’s more opportunity for the application of AI to have a broader impact on the lives of many people, positively and negatively, than any other means.

“Let’s compare it to television, which was created in the late 1920s. It took 20 years for 2 million homes to have one. Now, look at AI and the algorithm’s impact on society. It was created 50 years ago, but the rate of its expansion has outpaced TV’s. In less than 20 years, AI has reached nearly 3 billion monthly Facebook users around the globe. The reach of an algorithm has a faster growth rate than other technologies. If it goes wrong, it’s likely to have a quicker negative impact.”

What would you like your white friends or acquaintances to be doing right now?

“I would like my white friends to listen to diverse voices because they will learn from that. Also, seek out more than one diverse voice in a business meeting. When you have multiple diverse voices, you have built-in allyship. You will hear more from that group than a single person representing diversity.

“One of the things that I do regarding diversity, equality and inclusion, particularly as a trustee at The Stone Ridge School of the Sacred Heart, is to hold joint events with teachers, parents, students and alumni to openly discuss bias. We all go in with open ears. Black people don’t often want to have those discussions, particularly with white friends. They are afraid of being judged, saying the wrong thing, or being insensitive when expressing discontent.

“Lastly, recognize that if someone is called out on a particular action or racist remark, it’s not a blanket judgment about the person. It doesn’t mean they are racist; rather, it should be appreciated as an opportunity to improve.”

Listening to Black Voices is a series created by Heather Dunhill
