Louise Ferreira

Hi, I'm Louise — and this is the unfiltered version

You know those bios that make everything sound inevitable? Like someone always knew exactly where they were going? This isn't that. My path to AI research has been messy, full of doubt, and shaped by experiences that had nothing to do with Python or neural networks.

I'm a Black, queer woman from Mesquita in Baixada Fluminense — a place most people in tech have never heard of, but one that taught me everything about what it means when systems don't see you. When algorithms don't account for people like you. When "objective" technology somehow manages to be anything but.

"I didn't come to AI to build cool things. I came because I got tired of being an afterthought in other people's 'innovations.'"

So yeah, that's the lens I bring to this work. Not just the technical chops (though I've got those), but the lived experience of navigating spaces that weren't built with me in mind. And that combination? That's what drives everything you'll see in my research.

The Journey So Far

Not a resume — just the moments that shaped how I think about technology

The Beginning

Mesquita, Baixada Fluminense

Growing up in Baixada wasn't the tech origin story people expect. No coding camps or robotics clubs. But it gave me something more important: an understanding of what happens when you're invisible to the people making decisions that affect your life.

I watched my community get reduced to crime statistics. Saw friends with brilliant minds never get the chance to prove it. That's where my obsession with fairness started — long before I knew what an algorithm was.

2015-2019

International Relations & Finding My Voice

I started studying International Relations at UFSC thinking I'd work in diplomacy or policy. Turns out, I was actually learning how power works — who gets heard, who gets ignored, and how systems perpetuate inequality even when the people running them think they're being neutral.

That critical lens? It stuck with me. And later, when I got into tech, I realized algorithms are just policy written in code.

2019-2020

Poland, Culture, & Starting to Connect the Dots

Moving to Poland for my MA in Cultural Studies was when things started clicking. I was studying how culture shapes identity, how narratives get constructed, who gets to tell stories and who gets written out of them.

At the same time, I was watching AI systems make the same mistakes — erasing marginalized voices, encoding bias as "objectivity," treating complex humans like data points. I couldn't unsee the parallel.

2020-2023

Falling Into Tech (Kind Of)

Cybersecurity wasn't some grand plan. I needed a job, and EY was hiring. But working in data loss prevention and compliance — GDPR, LGPD, all those alphabet soup regulations — taught me something crucial: technical solutions without ethical frameworks are just expensive ways to automate harm.

I also realized I was good at this. Really good. Not just the technical stuff, but understanding the human side — why certain communities need protection, what "security" means when you've historically been surveilled instead of safeguarded.

Working at Revolut, CloudWalk, and other fintech companies showed me how fast tech moves — and how often ethics gets left in the dust.

2023

The Moment It All Made Sense

Here's the thing nobody tells you: sometimes your research topic finds you because you stumble into something that pisses you off enough to do something about it.

For me, it was seeing how content moderation algorithms penalize LGBT creators and sex workers — disproportionately, "objectively," algorithmically. Terms that should be neutral got flagged as inappropriate. Queer content got shadowbanned. And everyone just shrugged like "well, that's the AI."

Nah. That's bias. That's human prejudice at scale. And suddenly, everything I'd learned — cultural studies, policy, tech, my own lived experience — clicked into place. I could actually do something about this.

2024-Now

Building the Future I Want to See

Right now, I'm finishing my MSc in AI at the University of Essex as a Chevening/Lemann Scholar. My research focuses on algorithmic fairness in adult content classification — basically, teaching machines not to be bigots.

But it's bigger than that. I'm working on cultural data mining, building datasets for underrepresented communities, and trying to prove that intersectional analysis isn't just a "soft skill" — it's a technical necessity.

I think the best way to fix broken systems is to build better ones. And I've got ideas.

The Parts That Don't Fit on a Resume

🏳️‍🌈

Queer in Tech

Being openly queer in tech spaces — especially as a Black woman from Brazil — means constantly translating yourself. Explaining why certain "edge cases" matter. Why content moderation that targets LGBT language isn't a bug, it's oppression with a better UI.

My queerness isn't separate from my research. It is my research. When I talk about bias in classification systems, I'm talking about algorithms that would flag me.

✊🏾

Black, Brazilian, & Unapologetic

I'm from Mesquita. I'm Black. I'm Brazilian. And I refuse to minimize any of that to make people comfortable. These aren't fun facts about me — they're the reason I understand what's at stake when AI systems fail.

My community taught me resilience. My identity taught me what bias looks like in practice. And my work? That's me fighting back with code.

🌍

Global Experiences, Local Heart

I've lived in three countries, studied at four universities, and worked across continents. But home is still Baixada. Always will be.

That tension — between global ambition and local roots — keeps me grounded. I don't want to build AI for Silicon Valley. I want to build AI for the people who get left behind when tech "moves fast and breaks things."

What Drives Me

01

Justice Over Optimization

I'm not interested in making algorithms 2% more accurate if that 2% comes at the expense of marginalized communities. Fairness isn't a metric to game — it's the whole point.

02

Representation Isn't Enough

Being the only [fill in the blank] in the room is exhausting. I don't want to be the token. I want to build systems where people like me don't have to constantly prove we belong.

03

Community Before Career

Every door that opened for me, someone kicked down first. That's why I mentor, why I do DEI work, why I refuse to "make it" and forget where I came from. We rise together, or not at all.

So, That's Me

Messy, multidisciplinary, maybe a little too passionate about making sure algorithms don't screw over the people I love. I didn't take the traditional path to AI research, and honestly? I'm glad. Because the field needs people who understand that technology isn't neutral — it's as biased as the humans who build it.

My job is to make it less biased. My mission is to prove that intersectional, justice-driven AI isn't just possible — it's necessary.

Want to know more about the actual work? Check out my projects. Or if you want to talk about any of this — hit me up. I'm always down for a conversation about building better systems.