What if AI Truly Served Humanity? Event Report
London, 11 June 2025
She Shapes AI Hosts Bold, Sold-Out Gathering in London
What If AI Truly Served Humanity? That was the guiding question behind a sold-out evening hosted by She Shapes AI, in collaboration with Deeply Human Innovation, at the historic People’s Mission Hall in Whitechapel. With support from 42 London, x+why, and the London Interdisciplinary School, the event brought together a diverse group of more than 90 participants, from AI founders and policy specialists to creatives, civil servants, and educators, for an immersive programme of keynotes, breakouts, and conversation.
The occasion also marked a milestone: She Shapes AI’s first birthday. Over the past year, the organisation has evolved from a bold idea into a growing global ecosystem championing women-led leadership in AI and reimagining the values that shape its future.
Lighting the Way: Keynotes from the Margins
Three lightning keynote speakers opened the evening with urgent calls for a more inclusive, human-rooted future.
Anna Mae Yu Lamentillo, Founder of NightOwl GPT and a She Shapes AI Award winner, issued a powerful reminder that “language is power.” While over 6,900 languages are spoken worldwide, only 34 are digitally thriving, shutting out billions from meaningful interaction with AI. She called for models built on inclusive data, stewarded by the communities they aim to serve, insisting that language equity is not a luxury but a matter of dignity and digital sovereignty.
Dr Stella Pachidi of King’s College London exposed the disconnect between those designing AI tools and those meant to use them. She shared how the loss of tacit knowledge and real-world context leads to systems that are efficient but alienating. Her keynote called for co-design, interdisciplinary collaboration, and the recognition that AI is only ever as ethical as the systems and people shaping it.
Dr Shikoh Gitau, CEO of Qhala, offered a radical reframe: what if AI were built by and for African women? From Atunzi, a digital “auntie” supporting care work, to Hakii AI, a legal tool serving incarcerated women in local languages, she presented not theory but application. Her message was clear: when communities build for themselves, justice isn’t a promise. It’s the product.
Building Together: Breakouts for Imagination and Action
Following the keynotes, attendees split into six facilitated breakout sessions, spaces designed not just for dialogue but for shared authorship of what comes next.
Facilitated by Lucia Asanache of Renaissance Philanthropy and Rachel Gutierrez of She Shapes AI, “Building Ecosystems for Systemic Change” wrestled with the financial and structural barriers impeding ethical AI. From the rigid demands of traditional investors to the absence of a shared language around “responsibility,” the session exposed the ecosystem gaps that keep scalable solutions from taking root. Yet the tone remained hopeful, as participants envisioned new models, such as citizen-led AI testing and intuitive public standards, that place human needs above market pressures. The session affirmed that reimagining AI means reshaping the systems that fund, regulate, and scale it.
In “Closing the Skills Gap,” facilitated by Tania Revun of DeepMind, participants explored how AI is reshaping learning, creativity, and critical thinking. Many voiced concerns about over-reliance on tools and the erosion of problem-solving skills, while others highlighted the need for adaptable, lifelong learning frameworks. The conversation spanned classrooms and workplaces, questioning how to prepare people not just to use AI, but to think with it. Across the group, there was agreement that future-ready skills must prioritise discernment, curiosity, and human connection over automation alone.
In “Designing New Measures of Success,” facilitated by Iliana Gross-Buening of Deeply Human Innovation, participants explored what it would take to measure AI’s impact in human terms. The group challenged the dominance of financial and engagement-based metrics, proposing alternatives rooted in trust, authenticity, and well-being. From social media to the third sector, they asked how we might track what truly matters, such as autonomy, connection, and the long-term effects of our tools. The session underscored that redefining success in tech requires not only new KPIs, but a cultural shift in what we value and prioritise.
In “The Power of Storytelling,” led by Eniko Tarkany-Szucs of LinkedIn, participants examined how AI is reshaping the stories we tell and who gets to tell them. The conversation ranged from AI’s creative potential to its dangers, including deepfakes, cultural erasure, and narrative manipulation. While some saw opportunity in AI-generated storytelling, others warned of homogenisation and the erosion of authenticity. The group reflected on what it means to trust a story shaped by machines, and how storytelling, once a deeply human act, may now need new ethical boundaries. At its core, the session asked whether AI can ever truly understand the emotional weight of a story, or whether human connection must remain at the heart of how we communicate.
In “Designing AI for Human Empowerment,” co-facilitated by Kathryn Vickers of the UK Civil Service and Hannah Beresford of BCG, participants reflected on what it means to protect and prioritise human experience in the age of intelligent systems. The group explored where AI supports them (streamlining admin, aiding communication, enabling creativity) and where it falls short, such as reinforcing bias or flattening nuance. A key insight emerged: empowering design begins with honouring what makes us human, including judgement, emotion, collaboration, and the ability to grow through challenge. In exploring access to mental health support, the session surfaced concrete ways AI could serve as a tool for care, clarity, and connection rather than replacement.
In “Re-inventing Information with AI,” facilitated by Jenny Romano of The Newsroom and Laura Ellis of BBC R&D, participants explored how truth, trust, and control are being redefined in the information age. Dividing into five groups (Big Tech, Small Tech, Government, Civil Society, and Media), they examined the tensions between innovation and responsibility. Conversations surfaced deep concern about AI’s role in distorting history, eroding trust in the media, and manipulating human behaviour. At the same time, there was hope: that friction in information flow might spark critical thinking, that small tech could resist extraction-driven models, and that governments could set guardrails without stifling innovation. Across sectors, the session affirmed the urgent need to rebuild information ecosystems that serve democratic purposes, not just profit.
The Plenary: Reclaiming the Purpose of AI
As the evening drew to a close, participants reconvened for a dynamic panel discussion moderated by Iliana Gross-Buening, Co-Founder of Deeply Human Innovation. The question on the table: how do we actually build an AI future that serves humanity?
Mai Do, Responsible AI Specialist at Airbus, called for a socio-technical lens, insisting that ethics isn’t a checklist but a cultural shift. She challenged policymakers to go beyond compliance and invest in public understanding and civic engagement.
Jessica Butcher MBE, Founder of the ScrollAware Foundation, spoke about the addictive scroll culture engineered by today’s platforms. The emotional and social fallout (polarisation, anxiety, and the loss of attention) isn’t a bug, she reminded us; it’s a feature. Designing for human flourishing requires rethinking business models at the root.
Daniel Stanley, Executive Director of Future Narratives Lab, pushed for new coalitions and collective storytelling. If the world we inherit is shaped by the stories we allow to dominate, he asked, what might we create if we chose different ones?
Hannah Foxwell, Founder of AI for the Rest of Us, shared how she built inclusive, jargon-free spaces for AI literacy, spaces that meet people where they are, rather than locking knowledge behind technical gatekeeping. Her journey underscored the value of learning communities built on accessibility, not abstraction.
Together, the panellists challenged the room to redefine metrics of success, invest in safety and sustainability, and centre the needs of those too often excluded from tech’s most powerful rooms.
Final Reflections: A Future We Help Shape
While many discussions focused on social justice and governance, several voices raised the overlooked link between AI and environmental sustainability. As AI systems grow in scale and energy use, climate considerations must become core to how we design and deploy them. As one participant noted, “Sustainability isn’t a limitation. It’s a design decision.”
As twilight fell over London, attendees gathered in the courtyard for drinks, reflection, and connection. The evening closed not with certainty, but with a shared call to action:
To demand transparency.
To ask better questions.
To reclaim our agency in shaping AI.
Because ultimately, AI does not belong only to engineers or executives. It belongs to all of us. And if we want it to truly serve humanity, we must have the courage to shape it together.
Interested in sponsoring a future event or collaborating? Reach out to us at contact[at]sheshapes[dot]ai