Women + AI Summit 2.0: What Stayed With Me
The toughest part of the Women + AI Summit 2.0 wasn't deciding what to attend. It was accepting what I'd have to miss.
The schedule was so packed it was impossible to do everything "right." There were too many sessions I wanted to attend, too many people I wanted to talk to, and very little room to breathe between them. At one point, I made a conscious choice to step out.
I ended up sitting with two of my favorite people to play hooky with, Sunny Eaton and Lori Gonzalez, talking instead of listening. The conversation drifted, as it often does at good conferences, from tools to consequences. We circled around how systems like ChatGPT complicate the idea of a "reasonable expectation of privacy." We treat these tools like private conversations, even though they aren't. That gap, between how these systems feel and how they actually work, is where much of the risk lives.
Nearly everyone else stayed put. The sessions were too good to miss. And still, that conversation lingered. It was a reminder that even at a tightly programmed conference, some of the most meaningful moments come from choosing where to spend your attention.
That tension, between structure and spontaneity, defined the weekend for me. In a way, it mirrored the larger conversations we were having about AI itself: how much to automate, when to pause, and how to choose deliberately in the face of overwhelming possibility.
What Kept Showing Up
Looking back at the schedule, it would be easy to describe the summit as a progression from talks to workshops to hands-on building. But what stayed with me were the questions that kept resurfacing.
One of the clearest throughlines was AI literacy: not fluency with tools, but understanding. How these systems behave. Where they fail. And how much agency we hand over when we use them. Several talks traced turning points: fear giving way to curiosity, skepticism shifting into discernment. There was a shared recognition that opting out isn't neutral. Literacy allows engagement to be intentional rather than reactive.
As the day shifted from listening to building, the emphasis moved from tools to workflows. The most interesting conversations weren't about clever outputs. They were about boundaries and judgment. Not just what can be automated, but what should be.
Ethics showed up not as philosophy, but as practice, especially around data quality and provenance. "Bias in, bias out" wasn't a slogan. It was a warning. The concern wasn't only what AI produces, but what we feed it: whose experiences are represented, which sources are trusted, and how quickly flawed assumptions scale once embedded in a system.
That thread carried directly into access to justice. AI wasn't framed as a magic fix. If anything, there was a sober recognition that poorly designed systems can widen gaps as easily as close them. Access to justice wasn't a mission statement. It was a design constraint.
Beneath it all was governance: not as a future policy question, but as something already underway. The people choosing vendors, setting internal standards, and defining acceptable use are shaping the future in real time. Governance defaults to whoever is in the room.
Taken together, the summit wasn't about celebrating AI. It was about responsibility. About engaging with technology in ways that hold up over time.
What We're Taking Home
I didn't leave with a list of tools to try. I left with a clearer framework for approaching AI work.
Literacy comes before leverage. Adoption is an organizational design problem, not just a training issue. Ethics begins with inputs, not outputs. Access to justice must be built into systems from the start. And governance is already underway, whether we acknowledge it or not.
None of that is flashy. But it's foundational.
If there was a shift, it was this: move deliberately. Build the capacity to pause. Ask better questions before accelerating. The long-term impact of AI won't be determined by how fast we move, but by how thoughtfully we do.
Women in the Loop
Which is why it feels important to name something I've intentionally held until now: this was a conference centered on and led by women.
That mattered, not as branding, but as posture.
At many AI conferences, there's a YOLO energy: build fast, deploy faster, sort out consequences later. The emphasis is on scale and upside, with risk treated as friction.
That wasn't the posture here.
Instead of "What can we build?" the questions more often sounded like "What should we build?" and "Who does this affect?" There was comfort with uncertainty. Openness about tradeoffs. A willingness to admit what hadn't worked.
Even the design choices reflected that care. Speakers had walk-up songs. Dolly Parton's 9 to 5 marked transitions. Sessions were labeled mini, midi, and maxi, not by hierarchy, but by scale. None of it felt gimmicky. It felt intentional. Human.
This wasn't an absence of ambition. It was a different kind of ambition, one oriented toward durability, impact, and trust.
Representation didn't just change who was speaking. It changed what felt worth discussing.
Shaping What Gets Built
Cat Moon and her team at Vanderbilt Law created more than a conference. They created a space that modeled a different way of engaging with AI: curious, accountable, and deliberate.
I left not feeling pressured to adopt more tools, but clearer about the responsibility that comes with adopting any of them. In a field that often rewards speed, this felt like a necessary pause.
If this is where AI conversations are headed, more reflective, more inclusive, more honest about tradeoffs, it's a direction worth investing in.
The future of AI isn't shaped in the abstract. It's shaped in moments and weekends like this one.
This is why it matters who's in the room when decisions are made.