Feminist.AI works to put technology into the hands of makers, researchers, thinkers, and learners to amplify unheard voices and create more accessible AI for all. We create spaces where intergenerational BIPOC and LGBTQIA+ womxn and non-binary folks can gather to build tech together that is informed by our cultures, identities and experiences. We engage with intersectional feminism to spotlight our stories, inventions, designs, and leadership, and to co-create more equitable futures.


Job brief

I volunteered as a social media manager to administer the organization's social media accounts. I was responsible for creating original text and video content, managing posts, and responding to followers, presenting our image cohesively to achieve our marketing goals.

As social media manager, I kept all social media platforms up to date with the latest digital technologies and social media trends, and drew on strong communication skills to express the organization's views creatively.

Ultimately, I managed our social media presence, ensuring high levels of web traffic and audience engagement.


Responsibilities:

  • Perform research on current benchmark trends and audience preferences
  • Design and implement social media strategy to align with business goals
  • Set specific objectives and report on ROI
  • Generate, edit, publish and share engaging content daily (e.g. original text, photos, videos and news)
  • Monitor SEO and web traffic metrics
  • Collaborate with other teams, like marketing, sales and customer service to ensure brand consistency
  • Communicate with followers, respond to queries in a timely manner and monitor customer reviews
  • Oversee social media accounts’ design (e.g. Facebook timeline cover, profile pictures and blog layout)
  • Suggest and implement new features to develop brand awareness, like promotions and competitions
  • Stay up-to-date with current technologies and trends in social media, design tools and applications


Social Media Manager skills garnered:

  • Proven work experience as a Social media manager
  • Hands-on experience in content management
  • Excellent copywriting skills
  • Ability to deliver creative content (text, image and video)
  • Solid knowledge of SEO, keyword research and Google Analytics
  • Knowledge of online marketing channels
  • Familiarity with web design
  • Excellent communication skills
  • Analytical and multitasking skills
  • BSc degree in Marketing or relevant field




We believe that individuals should be able to understand and have a role in how technology affects their daily lives and communities. Our projects and related programs are for individuals at any level of exposure to artificial intelligence (AI) thinking. We approach each project by questioning assumptions embedded in AI and machine learning (ML) modeling and design approaches, while co-creating our own technologies, in order to make AI thinking accessible to all.


  • We design with and for unheard voices in AI creation. We must be invited to a location to participate and have consent from the communities with which we work.
  • We honor all knowledge systems and skills equally.
  • We acknowledge and own our privilege.
  • We design multiple entry points for involvement so we can pull from different knowledge systems for our design and development.
  • The AI project/research can be either a social response or technical making. We evaluate our methods as we work, revisiting every step of our process with every new project.
  • We believe individuals affected by technologies should be designing and making the technologies they use.
  • The physical (hardware, interaction, experience) and the digital are both key elements in our AI Design. Culture, material, and purpose are just as important as the data and model.
  • We attribute everything, including the people who have come before us and original parallel research.
  • We encourage our community to move beyond framing AI around human intelligence, or human-centered and gendered approaches to AI, and to think about alternative approaches to AI design (posthuman).
  • We probe the knowledge assumptions in AI systems and deliberately deconstruct existing approaches to AI creation, from the data to the rulesets to the output.
  • We contribute to and community source our own data and rules to control our own intelligent futures.
  • We use AI as a tool to both highlight and co-create arts and cultures.
  • We recognize AI as a design material.





Books, research, and organizations that inspire Feminist.AI’s work, as well as our own crowdsourced resource guides

Algorithms of Oppression

Safiya Umoja Noble

Purchase Algorithms of Oppression

A revealing look at how negative biases against women of color are embedded in search engine results and algorithms

In Algorithms of Oppression, Safiya Umoja Noble challenges the idea that search engines like Google offer an equal playing field for all forms of ideas, identities, and activities. Data discrimination is a real social problem; Noble argues that the combination of private interests in promoting certain sites, along with the monopoly status of a relatively small number of Internet search engines, leads to a biased set of search algorithms that privilege whiteness and discriminate against people of color, specifically women of color.

Through an analysis of textual and media searches as well as extensive research on paid online advertising, Noble exposes a culture of racism and sexism in the way discoverability is created online. As search engines and their related companies grow in importance—operating as a source for email, a major vehicle for primary and secondary school learning, and beyond—understanding and reversing these disquieting trends and discriminatory practices is of utmost importance.

An original, surprising and, at times, disturbing account of bias on the internet, Algorithms of Oppression contributes to our understanding of how racism is created, maintained, and disseminated in the 21st century.

Algorithms of Oppression Book Club Resources Guide

Created by the Feminist.AI community

Download the resource guide

This crowd-sourced resource guide was created in conjunction with the Algorithms of Oppression book club, which was co-hosted with the Women’s Center for Creative Work, in alliance with The Free Black Women’s Library – LA.

The resource is an emergent document where Book Club members contribute and share resources, organizations, ideas, and tools that build our community knowledge and support, centered on the concepts found in Algorithms of Oppression by Professor Safiya Noble. Our goal is to remain responsive in our programming, not only to our communities' interests but to the fight for Black Lives, especially the fight for Black Trans Queer Women & Folk. This book club lies at the intersections of tech, feminism, anti-racism, and the many other ways our community shows up. As we collaboratively build these resources together, we hope this document becomes a tool that reflects those intersections.

Encode LA Resources Guide

Created by the Feminist.AI community

Download the Encode LA Resources Guide & Collections for Open Community Making

This document was created in conjunction with our Encode kickoff event in February 2020. It is part of Feminist.AI’s 2020-2021 Encode programming, inspired by Algorithms of Oppression. Note: Due to COVID-19, timelines presented in this document are subject to change.

Weapons of Math Destruction

Cathy O’Neil

Purchase Weapons of Math Destruction

A former Wall Street quant sounds an alarm on the mathematical models that pervade modern life — and threaten to rip apart our social fabric

We live in the age of the algorithm. Increasingly, the decisions that affect our lives—where we go to school, whether we get a car loan, how much we pay for health insurance—are being made not by humans, but by mathematical models. In theory, this should lead to greater fairness: Everyone is judged according to the same rules, and bias is eliminated.

But as Cathy O’Neil reveals in this urgent and necessary book, the opposite is true. The models being used today are opaque, unregulated, and uncontestable, even when they’re wrong. Most troubling, they reinforce discrimination: If a poor student can’t get a loan because a lending model deems him too risky (by virtue of his zip code), he’s then cut off from the kind of education that could pull him out of poverty, and a vicious spiral ensues. Models are propping up the lucky and punishing the downtrodden, creating a “toxic cocktail for democracy.” Welcome to the dark side of Big Data.

Tracing the arc of a person’s life, O’Neil exposes the black box models that shape our future, both as individuals and as a society. These “weapons of math destruction” score teachers and students, sort résumés, grant (or deny) loans, evaluate workers, target voters, set parole, and monitor our health.

O’Neil calls on modelers to take more responsibility for their algorithms and on policy makers to regulate their use. But in the end, it’s up to us to become more savvy about the models that govern our lives. This important book empowers us to ask the tough questions, uncover the truth, and demand change.

Published by Crown Random House

Race After Technology

Ruha Benjamin

Purchase Race After Technology

From everyday apps to complex algorithms, Ruha Benjamin cuts through tech-industry hype to understand how emerging technologies can reinforce White supremacy and deepen social inequity.

Benjamin argues that automation, far from being a sinister story of racist programmers scheming on the dark web, has the potential to hide, speed up, and deepen discrimination while appearing neutral and even benevolent when compared to the racism of a previous era. Presenting the concept of the “New Jim Code,” she shows how a range of discriminatory designs encode inequity by explicitly amplifying racial hierarchies; by ignoring but thereby replicating social divisions; or by aiming to fix racial bias but ultimately doing quite the opposite. Moreover, she makes a compelling case for race itself as a kind of technology, designed to stratify and sanctify social injustice in the architecture of everyday life.

This illuminating guide provides conceptual tools for decoding tech promises with sociologically informed skepticism. In doing so, it challenges us to question not only the technologies we are sold but also the ones we ourselves manufacture.

Design Justice

Sasha Costanza-Chock

Purchase Design Justice

What is the relationship between design, power, and social justice?

“Design justice” is an approach to design that is led by marginalized communities and that aims explicitly to challenge, rather than reproduce, structural inequalities. It has emerged from a growing community of designers in various fields who work closely with social movements and community-based organizations around the world.

This book explores the theory and practice of design justice, demonstrates how universalist design principles and practices erase certain groups of people—specifically, those who are intersectionally disadvantaged or multiply burdened under the matrix of domination (white supremacist heteropatriarchy, ableism, capitalism, and settler colonialism)—and invites readers to “build a better world, a world where many worlds fit; linked worlds of collective liberation and ecological sustainability.” Along the way, the book documents a multitude of real-world community-led design practices, each grounded in a particular social movement. Design Justice goes beyond recent calls for design for good, user-centered design, and employment diversity in the technology and design professions; it connects design to larger struggles for collective liberation and ecological survival.

Afrofuturism

Ytasha Womack

Purchase Afrofuturism: The World of Black Sci-Fi and Fantasy Culture

Comprising elements of the avant-garde, science fiction, cutting-edge hip-hop, black comix, and graphic novels, Afrofuturism spans both underground and mainstream pop culture. With a twofold aim to entertain and enlighten, Afrofuturists strive to break down racial, ethnic, and all social limitations to empower and free individuals to be themselves.

Behind the Screen

Sarah T. Roberts

Purchase Behind the Screen: Content Moderation in the Shadows of Social Media

An eye-opening look at the invisible workers who protect us from seeing humanity’s worst on today’s commercial internet.

Social media on the internet can be a nightmarish place. A primary shield against hateful language, violent videos, and online cruelty uploaded by users is not an algorithm. It is people. Mostly invisible by design, more than 100,000 commercial content moderators evaluate posts on mainstream social media platforms: enforcing internal policies, training artificial intelligence systems, and actively screening and removing offensive material—sometimes thousands of items per day.

Sarah T. Roberts, an award-winning social media scholar, offers the first extensive ethnographic study of the commercial content moderation industry. Based on interviews with workers from Silicon Valley to the Philippines, at boutique firms and at major social media companies, she contextualizes this hidden industry and examines the emotional toll it takes on its workers. This revealing investigation of the people “behind the screen” offers insights into not only the reality of our commercial internet but the future of globalized labor in the digital age.

Emergent Strategy

adrienne maree brown

Purchase Emergent Strategy: Shaping Change, Changing Worlds

Inspired by Octavia Butler's explorations of our human relationship to change, Emergent Strategy is radical self-help, society-help, and planet-help designed to shape the futures we want to live. Change is constant. The world is in a continual state of flux. It is a stream of ever-mutating, emergent patterns. Rather than steel ourselves against such change, this book invites us to feel, map, assess, and learn from the swirling patterns around us in order to better understand and influence them as they happen. This is a resolutely materialist “spirituality” based equally on science and science fiction, a visionary incantation to transform that which ultimately transforms us.

Artificial Knowing

Alison Adam

Purchase Artificial Knowing

Artificial Knowing challenges the masculine slant in the Artificial Intelligence (AI) view of the world. Alison Adam admirably fills the large gap in science and technology studies by showing us that gender bias is inscribed in AI-based computer systems. Her treatment of feminist epistemology, focusing on the ideas of the knowing subject, the nature of knowledge, rationality, and language, is bound to make a significant and powerful contribution to AI studies.

Drawing from theories by Donna Haraway and Sherry Turkle, and using the tools of feminist epistemology, Adam provides a sustained critique of AI that interestingly reinforces many of the traditional criticisms of the AI project. Artificial Knowing is an essential read for those interested in gender studies, science and technology studies, and philosophical debates in AI.

How We Became Posthuman

N. Katherine Hayles

Purchase How We Became Posthuman: Virtual Bodies in Cybernetics, Literature, and Informatics

In this age of DNA computers and artificial intelligence, information is becoming disembodied even as the "bodies" that once carried it vanish into virtuality. While some marvel at these changes, envisioning consciousness downloaded into a computer or humans "beamed" Star Trek-style, others view them with horror, seeing monsters brooding in the machines. In How We Became Posthuman, N. Katherine Hayles separates hype from fact, investigating the fate of embodiment in an information age.

Hayles relates three interwoven stories: how information lost its body, that is, how it came to be conceptualized as an entity separate from the material forms that carry it; the cultural and technological construction of the cyborg; and the dismantling of the liberal humanist "subject" in cybernetic discourse, along with the emergence of the "posthuman."

Ranging widely across the history of technology, cultural studies, and literary criticism, Hayles shows what had to be erased, forgotten, and elided to conceive of information as a disembodied entity. Thus she moves from the post-World War II Macy Conferences on cybernetics to the 1952 novel Limbo by cybernetics aficionado Bernard Wolfe; from the concept of self-making to Philip K. Dick’s literary explorations of hallucination and reality; and from artificial life to postmodern novels exploring the implications of seeing humans as cybernetic systems.

Although becoming posthuman can be nightmarish, Hayles shows how it can also be liberating. From the birth of cybernetics to artificial life, How We Became Posthuman provides an indispensable account of how we arrived in our virtual age, and of where we might go from here.


  • Social AI Design Tool (coming soon)
  • AI. Culture. Creativity. (coming soon)
  • Wekinator
  • Runway ML
  • ml5.js
  • p5.js
  • Processing

AI / Emerging Tech Resources

Art & Social Tech Organizations

Industry Communities

LA Shout Outs

Organizations We’re Inspired By

Women’s Center For Creative Work

Machine Project (no longer active)

Las Fotos Project

Inner-City Arts

Echo Park Film Center

AI Artists

Check out AI. Culture. Creativity.