Prejudice

We tend to hold attitudes, assumptions, or expectations about people based on group membership, often without realizing it. While prejudice is commonly linked with explicit acts of bias, it's the more subtle, unconscious forms that quietly shape our decisions, perceptions, and interactions.

The word “prejudice” usually brings to mind open hostility or discrimination, but cognitive science has shown a more complex and widespread reality. Essentially, prejudice means pre-judging others, forming opinions about individuals or groups without enough knowledge. While explicit prejudice happens consciously (and is becoming more stigmatized), implicit prejudice silently influences us in the background of our thinking, often without us realizing.

In the late 20th century, psychologists started looking beyond what people said they believed to focus on how they actually behaved, especially in situations that required quick decisions or gut reactions. In 1998, researchers Anthony Greenwald, Debbie McGhee, and Jordan Schwartz introduced the Implicit Association Test (IAT) to measure subconscious biases. Participants were asked to quickly sort words and images, and their response times revealed deep-rooted mental associations, such as linking certain races or genders with specific traits or roles.
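The IAT's core idea is simple to illustrate: if a participant is reliably slower when sorting stereotype-inconsistent pairings than stereotype-consistent ones, that latency gap suggests an implicit association. The sketch below is a loose, simplified version of that scoring idea (not the full published algorithm), with entirely hypothetical response times:

```python
from statistics import mean, stdev

def iat_d_score(congruent_ms, incongruent_ms):
    # Simplified IAT-style score: difference in mean response times
    # between the two blocks, scaled by the pooled standard deviation.
    pooled = stdev(congruent_ms + incongruent_ms)
    return (mean(incongruent_ms) - mean(congruent_ms)) / pooled

# Hypothetical response times in milliseconds.
congruent = [620, 650, 640, 610, 660]    # stereotype-consistent pairings
incongruent = [780, 810, 790, 760, 820]  # stereotype-inconsistent pairings

d = iat_d_score(congruent, incongruent)
print(round(d, 2))  # a positive score means slower incongruent responses
```

A score near zero would indicate no measurable latency gap; the larger the positive score, the stronger the implicit association the test infers.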

This research showed that even people who explicitly endorsed egalitarian and progressive views often exhibited implicit biases that contradicted their stated beliefs. These biases are shaped by cultural exposure, media, education, and experiences, and can exist even without any malicious intent. They function more like cognitive shortcuts that allow us to make rapid decisions, but at the cost of fairness and objectivity.


Prejudice manifests in everyday decisions, such as hiring, mentoring, performance evaluations, and even casual conversations. A resume with a “foreign-sounding” name might be judged differently. A teammate’s idea might be dismissed based on age, gender, or background. Over time, these small unfairnesses add up to systemic disadvantages. Recognizing this doesn’t mean we’re bad people; it means we’re human. But it also means we have a responsibility to break these patterns when we see them.

In team settings—especially cross-functional ones—prejudice can quietly undermine collaboration, trust, and innovation. Unlike explicit bias, which is easier to recognize and combat, implicit bias exists in the gray areas: who gets heard in meetings, who receives the “stretch” projects, or whose mistakes are forgiven versus remembered.

For example, a senior engineer might unconsciously defer to another teammate who has a similar background or communication style, even if that person is less qualified. A product manager might assume that marketing lacks technical depth or that junior team members are less likely to have strategic insights. These assumptions aren’t openly expressed, but they influence behavior: who gets included in early discussions, whose feedback is prioritized, and whose voices fade into the background.

This bias can influence performance reviews and promotion decisions. A manager may describe one employee as a “natural leader” and another as “not quite ready,” despite similar outputs. These are subtle differences that often correlate with identity markers like race, gender, or age. Even well-meaning teams can unknowingly recreate systemic barriers if they fail to recognize how these patterns operate.

🎯 Here are some key takeaways:

Don’t confuse intent with impact

You don’t have to mean harm to cause it. Prejudice can influence how we treat others even when we believe we're being fair. Acknowledging this helps us take responsibility for outcomes, not just intentions.

Slow down your snap judgments

Sorry tech bros, but implicit bias thrives in speed and ambiguity. When you feel yourself making a quick assessment—about someone’s competence, attitude, or fit—pause and ask: “What evidence am I basing this on?” Being reflective helps reduce bias.

“Fit” is not a free pass for exclusion

Hiring or promotion decisions often fall back on the idea of “fit.” But without clear definitions and rubrics, that word becomes a proxy for familiarity. Replace it with structured, job-relevant criteria.
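One way to make “fit” concrete is to agree on weighted, job-relevant criteria before reviewing anyone, then score every candidate against the same rubric. A toy sketch (the criteria and weights below are purely illustrative):

```python
# Hypothetical rubric: criteria and weights agreed before any review,
# each criterion scored 1-5 against written evidence.
rubric = {"system design": 0.4, "code review quality": 0.3, "communication": 0.3}

def score(candidate_scores):
    # Weighted average over the pre-agreed criteria only --
    # nothing outside the rubric ("fit", "vibe") can move the number.
    return sum(rubric[c] * candidate_scores[c] for c in rubric)

result = score({"system design": 4, "code review quality": 5, "communication": 3})
print(result)  # 4.0
```

The point isn't the arithmetic; it's that the criteria are fixed in advance, so familiarity can't quietly stand in for qualification.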

Track who gets heard, not just who speaks

It’s easy to miss patterns when everyone is “contributing.” Track who gets airtime in meetings, whose suggestions become actions, and who gets interrupted. These metrics reveal more than intention ever will.
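Even a crude tally makes these patterns visible. A minimal sketch, with an entirely hypothetical meeting log, counting speaking turns, interruptions, and adopted suggestions per person:

```python
from collections import Counter

# Hypothetical meeting log: (speaker, was_interrupted, suggestion_adopted).
turns = [
    ("Ana", False, True),
    ("Ben", True, False),
    ("Ana", False, True),
    ("Caro", True, False),
    ("Ben", True, False),
]

airtime = Counter(speaker for speaker, _, _ in turns)
interrupted = Counter(s for s, cut_off, _ in turns if cut_off)
adopted = Counter(s for s, _, ok in turns if ok)

for person in airtime:
    print(person, airtime[person], interrupted[person], adopted[person])
```

Run over a few weeks of real meetings, even counts this simple can reveal whose contributions consistently land and whose consistently get cut short.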

Bias often feels neutral. Interrogate your instincts

If a decision feels like a “gut call,” slow down. Ask: is this based on evidence, or comfort? Often, our instincts are shaped by cultural norms, not objectivity.
