METHODOLOGY

How do I get my team to actually use the AI tools I paid for?

AI adoption fails when the tool sits separate from where the team already works. I run a 90-minute hands-on workshop where each person ships one real Skill that solves one of their tasks. By the end of the room, the system is being used, not promised. Then I stay on retainer to keep building.

The line that explains why most AI rollouts fail

A partner at a boutique law firm in Brickell, stuck with exactly this dead-tool problem, told me the line that explains it: "We spent money on a tool two years ago, nobody used it, and that still stings." A senior partner at a $14B wealth advisory firm said the same thing in different words: "The difference between Microsoft Copilot and Claude is huge." They had paid for Copilot. Nobody used it. The tool was bought, not adopted. The same pattern shows up again and again in wealth firms: Copilot regret, with the license sitting unused on every desk.

A company buys a license, sends out a launch email, schedules a one-hour training, and then expects behavior to change. It does not.

The tool lives in a separate browser tab. The team's actual work lives in email, files, WhatsApp, and spreadsheets. The gap never closes.

WHAT I DO INSTEAD

The 90-minute workshop

The first session is 90 minutes, hands-on, with the people who will actually use it in the room. We do not look at slides. We do not watch a demo. We do not run through capabilities.

Each person picks one task they did this week that ate too much time. We build a Skill that handles that task. By the end of 90 minutes, every person in the room has a working tool tied to a real workflow. They do not have to remember to use it next week, because they already used it today.

Mapi at The Joy of Impact wrote me a message after her workshop:

"I just had... ah, such a lovely time. We could not have done this without you."

She and her VA in Colombia walked out with six Skills installed and the operating file connected to her Drive. Three weeks later she was referring me to her own coach.

WHY THIS WORKS

Why this works when launch emails and one-hour trainings don't

Three reasons.

One. The tool is built into the workflow, not next to it.

A finance analyst's first Skill drafts the variance commentary she writes every Monday morning. A construction PM's first Skill summarizes the foreman's WhatsApp report into the project update he sends every evening. The Skill replaces real work the person was already doing. They don't have to remember the new tool, because the new tool finishes a task they would have started anyway.
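A Skill like the variance-commentary one can be as small as a single instruction file. Here is a minimal sketch, assuming a SKILL.md-style file with YAML frontmatter; the 5% threshold, the file names, and the steps are all invented for illustration, not taken from a real engagement:

```markdown
---
name: variance-commentary
description: Drafts the Monday-morning variance commentary from the weekly actuals-vs-budget export.
---
<!-- Illustrative sketch: every threshold and file name below is hypothetical -->

When the user uploads the weekly actuals-vs-budget export:

1. Flag every line item that deviates from budget by more than 5%.
2. Draft one sentence of commentary per flagged item: what moved, by how
   much, and the likely driver if it appears in the notes column.
3. Match the tone of the sample in examples/monday-commentary.md.
4. End with a one-paragraph summary addressed to the CFO.
```

The point of the format is that the instructions live with the task, not in someone's head, so the Skill fires every Monday whether or not anyone remembers the training.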

Two. Each person owns their first win.

The session is hands-on, not passive. Each attendee builds something themselves with my help and sees it work on their actual data, with their actual context. The psychological barrier ("I don't know how to use this") is gone before they leave the room.

Three. The retainer keeps the system alive.

New workflows appear every quarter. Without ongoing Skill creation and CLAUDE.md updates, the system goes stale within 6 to 12 months and the tool dies a second death. Sami at Almaga ships new Skills weekly. A solopreneur psychologist on the lower tier ships them quarterly. The retainer flexes to what the team actually uses, and the workshop-and-retainer engagement is how this stays alive past month four.

WHAT GETS INSTALLED

What gets installed in the workshop

For a 5-person team, the 90-minute session typically produces:

  • 5 working Skills, one per person, tied to real weekly tasks
  • The first version of CLAUDE.md, with the company's context, voice, and key rules
  • Connections to whatever sources the team's data lives in (Drive, Outlook, sometimes WhatsApp Business)
  • A working agreement: who edits the CLAUDE.md, how new Skills get requested, what the monthly cadence looks like
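To make the second bullet concrete, here is a sketch of what a first-version CLAUDE.md might look like coming out of the session. Every name, rule, and detail below is hypothetical, invented to show the shape of the file rather than any client's actual setup:

```markdown
# Acme Advisors — Operating File
<!-- Hypothetical first-version CLAUDE.md; all details are illustrative -->

## Company context
- 5-person wealth advisory team; clients are families in South Florida.
- Client files live in Drive under /Clients; never quote balances in drafts.

## Voice
- Plain English, short sentences, no jargon.
- Reply in Spanish when the client writes in Spanish.

## Key rules
- Nothing client-facing goes out without a named reviewer.
- Cite the source file for every number.

## Maintenance
- Maria edits this file; new Skill requests go in the #ai-requests channel.
- Reviewed in the monthly working session.
```

The first version is deliberately short. It grows as the retainer adds Skills and the team learns which rules the model actually needs.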

A 10 to 20-person team gets a half-day version. A 50+ team gets multiple sessions broken by department. The shape is the same: each person ships one thing, and the team walks out with a system built on the Skill plus CLAUDE.md architecture, not a promise.

MODEL UPGRADES

What model upgrades have to do with this

Anthropic shipped Opus 4.7 in April 2026. Newer models follow instructions more literally than older ones. That is good news for teams with a clean CLAUDE.md and well-written Skills, because the same operating file produces better output on the new model. It is bad news for teams running unstructured prompts, because sloppy prompts break harder on more literal models.

The workshop-first method protects the team from this. The Skill format and the CLAUDE.md architecture get better with each model release, not worse. That is the part most agencies miss when they hand a team a license and walk away.

WHAT YOU OWN

What you actually own at the end

Every Skill from the workshop sits in your Claude account. The CLAUDE.md operating file lives in your Drive. The integrations connect to your accounts. You own everything we build. If you decide tomorrow that you no longer need a Fractional Head of AI, the system stays running because nothing about it depends on me being around.

If your team has a dead tool sitting in a browser tab right now, that is the conversation to have. Tell me which tool died and why, and I will tell you whether a workshop-first restart is worth running.

Adoption Questions

Frequently Asked Questions

Why do most AI rollouts fail?

The tool sits separate from where the team already works. A license gets bought, an email gets sent, a one-hour training gets scheduled, and then everyone goes back to their actual workflows in email, files, and spreadsheets. The gap never closes.

What does a workshop-first rollout look like?

A 90-minute hands-on session with the people who will actually use the tool in the room. No slides, no demo. Each person picks one task they did this week and we build a Skill that handles it. They walk out using the system.

How is this different from regular AI training?

Typical training is passive: someone explains capabilities, the team watches. Workshop-first is active: each attendee ships a Skill on their own data with their own context. The psychological barrier disappears before they leave. They do not have to remember the new tool because they already used it.

How long does the workshop take?

90 minutes for a 5-person team. Half a day for 10 to 20 people. Multiple sessions split by department for 50+ teams. The shape is the same: each person ships one Skill tied to a real weekly task, and the team walks out with a system.

What does the team walk out with?

Typically 5 working Skills (one per person), the first version of CLAUDE.md with company context and rules, connections to where the team's data lives (Drive, Outlook, sometimes WhatsApp), and a working agreement on who maintains the system going forward.

Why do AI tools die a few months after launch?

Because nothing maintains them. Initial setup ships, then new workflows appear, the CLAUDE.md goes stale, the Skills no longer match what the team does, and the tool dies a second death. The retainer prevents this by adding Skills as workflows appear.

Does this work for a team of two?

Yes. Mapi at The Joy of Impact had a 90-minute session with herself and her VA. They walked out with six Skills, a working CLAUDE.md, and connections to Drive. Same architecture as a 50-person team, just compressed. Three weeks later she was referring me to her coach.

What if the team is skeptical of AI?

Resistance usually traces back to a previous tool that died on them. The workshop-first method addresses that directly: instead of explaining why this tool is different, each person ships a working Skill on their own data in 90 minutes. The proof is in the room, not in slides.

What does the retainer cover?

New Skills as new workflows appear. CLAUDE.md updates as the business changes. Monthly working sessions to triage what's working and what isn't. The retainer is justified by visible output (Skills shipped, files updated), not by the calendar. Cancel anytime if the output stops compounding.

What is a Skill?

A reusable instruction set Claude loads on demand. A Skill for drafting Monday variance commentary. A Skill for summarizing a foreman's WhatsApp report. A Skill for writing the proposal section the team copies and pastes. It is the technical part of the system most teams cannot build alone.
