At Sequentur, we spend a lot of time helping clients adopt new technology the right way—especially when compliance, privacy, and human nuance are in play.
So when I had the opportunity to co-present at the Florida Academy of Collaborative Professionals’ (FACP) annual conference with John Bridges of Cognivise.AI, we didn’t just demo some flashy AI tools. We tackled the technical realities of bringing AI into a sensitive, high-stakes professional environment—one where the workflows are personal, the data is protected, and the users are not exactly early adopters.
And to prove our point? We let AI brainstorm options for cat custody. (More on that in a moment.)
The Use Case: AI That Enhances Your Collaborative Practice
Collaborative Law is designed to reduce conflict, encourage empathy, and create solutions that support the emotional and financial well-being of families. That’s not something you automate. But it is something AI can support.
We showed how AI can be used to:
- Generate multiple versions of parenting plans or agreements
- Create anonymized training case summaries
- Capture meeting minutes and auto-generate follow-up task lists
- Draft newsletters or outreach materials
- Accelerate intake workflows and internal documentation
And we didn’t stop at hypotheticals—we demonstrated it live. One prompt we used: “What are some creative ways divorcing couples can share custody of a cat?” The LLM generated thoughtful, creative (and sometimes hilarious) responses, ranging from alternating weekends to letting the cat decide. The point? AI can support emotionally intelligent brainstorming, not just churn out sterile legalese.
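For the technically curious, here’s roughly what that demo looks like as code. This is a minimal sketch using OpenAI’s Python SDK; the model name and settings are illustrative, not the exact setup we used on stage.

```python
# Minimal sketch of the live demo: the cat-custody brainstorm as an API call.
# Assumes the openai Python SDK is installed and OPENAI_API_KEY is set in the
# environment; the model name below is illustrative.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",  # any chat-capable model works for this
    messages=[
        {
            "role": "system",
            "content": (
                "You are brainstorming for a collaborative divorce team. "
                "Offer a wide range of ideas, from conventional to creative."
            ),
        },
        {
            "role": "user",
            "content": "What are some creative ways divorcing couples can share custody of a cat?",
        },
    ],
    temperature=0.9,  # a higher temperature favors varied, creative output
)

print(response.choices[0].message.content)
```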
But What About Compliance?
This is where most conversations around AI adoption in law, healthcare, and financial services tend to die out. And understandably so. Rather than let that happen, we focused on how to keep experimentation safe while building toward long-term compliance.
We broke down:
- FIPA (Florida Information Protection Act) obligations for data breach notification and storage of personal information
- HIPAA implications when working with PHI in client files, including the very real limitations of public LLMs like ChatGPT and Claude
- The critical need for Business Associate Agreements (BAAs) when integrating AI-powered tools from vendors like Microsoft
The takeaway: BAAs alone don’t equal compliance. You need to configure your systems properly, document your risk management process, and ensure users understand what not to do.
Public vs Private AI Models: Infrastructure Matters
Here’s where we leaned into the IT architecture side.
Public tools like ChatGPT and Google Gemini are great for low-stakes work (brainstorming, email drafts, internal content creation), but they shouldn’t touch client data. Ever.
For real use inside collaborative law firms, we recommend private or hybrid deployments:
- Microsoft 365 Copilot with compliance configurations
- Azure-hosted LLMs with role-based access, DLP, and endpoint security controls
- On-prem LLMs (e.g., LLaMA or Mistral) for firms that need full data sovereignty or operate behind firewalls (see the sketch below)
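For firms weighing the on-prem route, the basic pattern is lighter than it sounds. Here’s a minimal sketch, assuming a locally hosted Mistral model served through Ollama (after running `ollama pull mistral`), so prompts never leave your network:

```python
# Minimal sketch of an on-prem pattern: a local model served by Ollama,
# so prompts and client data never leave the firm's network. Assumes
# Ollama is running locally and `ollama pull mistral` has been run;
# the endpoint and payload follow Ollama's REST API.
import requests

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

payload = {
    "model": "mistral",  # any locally pulled model tag
    "prompt": "Summarize this intake note for the case file: <note text here>",
    "stream": False,     # return one complete JSON response instead of a stream
}

resp = requests.post(OLLAMA_URL, json=payload, timeout=120)
resp.raise_for_status()
print(resp.json()["response"])  # the model's generated text
```

Because the model runs on hardware you control, this is the kind of deployment where prompts can safely reference client matters, provided the host itself meets your security baseline.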
Each model has trade-offs:
- Public AI is fast, cheap, and flexible—but poses significant privacy risks.
- Private AI offers full control, auditability, and security—but requires serious IT overhead, maintenance, and capacity planning.
At Sequentur, we help clients evaluate which route makes sense, taking into account compliance, technical capability, and budget. For many firms, securely integrating Microsoft Copilot into their M365 ecosystem is the right balance of power and protection—as long as it’s configured correctly.
Configuration Checklist: Microsoft 365 + AI
If you’re planning to deploy Copilot or use AI-enhanced tools in your Microsoft 365 environment, here are the minimum security baselines we recommend:
- Sign a BAA with Microsoft
- Enforce Multi-Factor Authentication (MFA) org-wide
- Enable Data Loss Prevention (DLP) policies specifically for PHI and PII
- Configure Microsoft Purview Audit Logging
- Use Sensitivity Labels and Message Encryption
- Limit External Sharing across OneDrive, SharePoint, and Teams
- Implement Role-Based Access Control (RBAC) for document access
- Deploy Defender for Office 365 for phishing/malware protection
- Set Retention and Deletion Policies for compliance with state and federal laws
- Use Compliance Manager and Secure Score to continuously assess risk posture (see the sketch after this list)
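Most of these settings live in the Microsoft 365 admin and Purview portals, but your posture is worth verifying programmatically, not just setting once. As one example, here’s a minimal sketch that pulls the tenant’s current Secure Score from Microsoft Graph; it assumes an Entra ID app registration with the SecurityEvents.Read.All application permission, and the IDs shown are placeholders.

```python
# Minimal sketch: pull the tenant's current Microsoft Secure Score from
# Microsoft Graph so risk posture can be tracked over time. Assumes an
# Entra ID app registration granted the SecurityEvents.Read.All
# application permission; the tenant/client values are placeholders.
import msal
import requests

TENANT_ID = "your-tenant-id"
CLIENT_ID = "your-app-client-id"
CLIENT_SECRET = "your-client-secret"  # keep this in a vault, not in code

app = msal.ConfidentialClientApplication(
    CLIENT_ID,
    authority=f"https://login.microsoftonline.com/{TENANT_ID}",
    client_credential=CLIENT_SECRET,
)
token = app.acquire_token_for_client(scopes=["https://graph.microsoft.com/.default"])
if "access_token" not in token:
    raise RuntimeError(token.get("error_description", "token request failed"))

resp = requests.get(
    "https://graph.microsoft.com/v1.0/security/secureScores?$top=1",  # most recent entry
    headers={"Authorization": f"Bearer {token['access_token']}"},
)
resp.raise_for_status()
score = resp.json()["value"][0]
print(f"Secure Score: {score['currentScore']} / {score['maxScore']}")
```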
If any of this isn’t already part of your IT operations playbook, it needs to be.
Best Practices for Responsible AI Use in Client Workflows
We closed the session with a few universal principles for any professional experimenting with AI:
- Don’t upload PII/PHI into public tools (see the redaction sketch after this list).
- Validate everything AI outputs before using it.
- Keep human review in the loop.
- Start small—then scale intentionally.
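To make the first rule concrete, here’s a minimal, illustrative sketch of pre-prompt redaction. Simple regexes like these catch only the obvious patterns (emails, US phone numbers, SSNs), so treat this as a seatbelt, not a compliance control; human review still applies.

```python
# Illustrative sketch: strip obvious identifiers before any text is sent
# to a public AI tool. Regexes catch only easy cases (emails, US phone
# numbers, SSNs); this is a seatbelt, not a compliance control.
import re

REDACTIONS = [
    (re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"), "[EMAIL]"),
    (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "[SSN]"),
    (re.compile(r"\b(?:\+1[ .-]?)?\(?\d{3}\)?[ .-]?\d{3}[ .-]?\d{4}\b"), "[PHONE]"),
]

def redact(text: str) -> str:
    """Replace obvious PII patterns with placeholder tokens."""
    for pattern, placeholder in REDACTIONS:
        text = pattern.sub(placeholder, text)
    return text

note = "Reach Jane at jane.doe@example.com or 555-867-5309 re: SSN 123-45-6789."
print(redact(note))
# -> Reach Jane at [EMAIL] or [PHONE] re: SSN [SSN].
```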
Many tools in your current stack (like Microsoft 365, Zoom, Slack, Smokeball, Clio) are quietly rolling out AI features. If you’re not reviewing those settings and assessing how data flows through them, you may already be exposed.
So… What About That Cat?
The AI-generated cat custody brainstorm wasn’t just comic relief. It highlighted the real value of AI in emotionally nuanced situations. It gave us:
- Legally plausible outcomes
- Thoughtful considerations (e.g., animal stress, children’s attachment)
- A wide menu of options—some standard, some creative
That’s what good AI-assisted practice looks like: human-first, insight-driven, and always reviewed through a professional lens.
One of the attendees summed it up best:
“This was a great primer into the power of AI to positively impact all our workflows. For the AI uninitiated, it provided a base knowledge of the capabilities and pitfalls of this new technology. I just wish we had more time.”
– Rick Rhodes, CPA
The Big Takeaway from the FACP Conference
Collaborative law is built on a foundation of trust, care, and emotional intelligence. AI doesn’t replace that—it supports it. But only if implemented securely, responsibly, and with IT expertise that understands both the tech and the context.
That’s what we bring to the table at Sequentur. Whether you’re exploring Microsoft Copilot, planning a private LLM deployment, or just trying to get your head around what’s safe to use in your firm—we’ve got your back.