Smart AI, Safe Data: A Field Guide for Collaborative Professionals
At Sequentur, we spend a lot of time helping clients adopt new technology the right way, especially when compliance, privacy, and human nuance are in play.
So when I had the opportunity to co-present at the Florida Academy of Collaborative Professionals (FACP) annual conference with John Bridges of Cognivise.AI, we didn't just demo some flashy AI tools. We tackled the technical realities of bringing AI into a sensitive, high-stakes professional environment: one where the workflows are personal, the data is protected, and the users are not exactly early adopters.
And to prove our point? We let AI brainstorm options for cat custody. (More on that in a moment.)
The Use Case: AI That Enhances Your Collaborative Practice
Collaborative Law is designed to reduce conflict, encourage empathy, and create solutions that support the emotional and financial well-being of families. That's not something you automate. But it is something AI can support.
We showed how AI can be used to:
- Generate multiple versions of parenting plans or agreements
- Create anonymized training case summaries
- Capture meeting minutes and auto-generate follow-up task lists
- Draft newsletters or outreach materials
- Accelerate intake workflows and internal documentation
And we didn't stop at hypotheticals; we demonstrated it live. One prompt we used: "What are some creative ways divorcing couples can share custody of a cat?" The LLM generated thoughtful, creative (and sometimes hilarious) responses, ranging from alternating weekends to letting the cat decide. The point? AI can support emotionally intelligent brainstorming, not just churn out sterile legalese.
But What About Compliance?
This is where most conversations around AI adoption in law, healthcare, and financial services tend to die out. And understandably so. We focused on how to keep experimentation safe while building for long-term compliance.
We broke down:
- FIPA (Florida Information Protection Act) obligations for data breach notification and storage of personal information
- HIPAA implications when working with PHI in client files, including the very real limitations of public LLMs like ChatGPT and Claude
- The critical need for Business Associate Agreements (BAAs) when integrating AI-powered tools from vendors like Microsoft
The takeaway: BAAs alone don't equal compliance. You need to configure your systems properly, document your risk management process, and ensure users understand what not to do.
Public vs Private AI Models: Infrastructure Matters
Here's where we leaned into the IT architecture side.
Public tools like ChatGPT and Google Gemini are great for soft uses (brainstorming, email drafts, internal content creation), but they shouldn't touch client data. Ever.
For real use inside collaborative law firms, we recommend private or hybrid deployments:
- Microsoft 365 Copilot with compliance configurations
- Azure-hosted LLMs with role-based access, DLP, and endpoint security controls
- On-prem LLMs (e.g., LLaMA or Mistral) for firms that need full data sovereignty or operate behind firewalls
Each model has trade-offs:
- Public AI is fast, cheap, and flexible, but poses significant privacy risks.
- Private AI offers full control, auditability, and security, but requires serious IT overhead, maintenance, and capacity planning.
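One way to operationalize that split is a routing guardrail that keeps anything resembling client data off public endpoints. The sketch below is illustrative only: the patterns and tier names are our assumptions, not a complete PII detector, and a production deployment would sit behind a real DLP layer.

```python
import re

# Illustrative patterns only; a real DLP layer would use far richer
# detection (named entities, dictionaries, context rules).
SENSITIVE_PATTERNS = [
    re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),                          # SSN-like
    re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),                    # email address
    re.compile(r"\bcase\s*(no\.?|number)\s*\S+", re.IGNORECASE),   # case number
]

def route_prompt(prompt: str) -> str:
    """Return which model tier should handle this prompt."""
    if any(p.search(prompt) for p in SENSITIVE_PATTERNS):
        return "private"   # Azure-hosted or on-prem LLM behind the firewall
    return "public"        # ChatGPT/Gemini for soft, non-client content
```

A newsletter draft routes to the public tier; anything carrying an SSN, email address, or case number routes to the private one.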
At Sequentur, we help clients evaluate which route makes sense, taking into account compliance, technical capability, and budget. For many firms, securely integrating Microsoft Copilot into their M365 ecosystem is the right balance of power and protection, as long as it's configured correctly.
Configuration Checklist: Microsoft 365 + AI
If you're planning to deploy Copilot or use AI-enhanced tools in your Microsoft 365 environment, here are the minimum security baselines we recommend:
- Sign a BAA with Microsoft
- Enforce Multi-Factor Authentication (MFA) org-wide
- Enable Data Loss Prevention (DLP) policies specifically for PHI and PII
- Configure Microsoft Purview Audit Logging
- Use Sensitivity Labels and Message Encryption
- Limit External Sharing across OneDrive, SharePoint, and Teams
- Implement Role-Based Access Control (RBAC) for document access
- Deploy Defender for Office 365 for phishing/malware protection
- Set Retention and Deletion Policies for compliance with state and federal laws
- Use Compliance Manager and Secure Score to continuously assess risk posture
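A checklist like this is easiest to keep honest when it's encoded as an audit script. The following is a hedged sketch: the setting keys and the `tenant_config` dict are illustrative stand-ins, not actual Microsoft Graph or Purview property names; a real audit would pull live values from the Microsoft Graph API or Secure Score.

```python
# Baseline derived from the checklist above. Keys are illustrative
# names, not real Microsoft Graph properties.
BASELINE = {
    "baa_signed": True,
    "mfa_enforced": True,
    "dlp_phi_pii_policies": True,
    "purview_audit_logging": True,
    "sensitivity_labels": True,
    "external_sharing_limited": True,
    "rbac_enabled": True,
    "defender_for_office": True,
    "retention_policies": True,
}

def audit(tenant_config: dict) -> list[str]:
    """Return the checklist items the tenant is still missing."""
    return [key for key, required in BASELINE.items()
            if required and not tenant_config.get(key, False)]
```

Running `audit({"mfa_enforced": True})` would flag every other baseline item as a gap, which is a useful starting agenda for a remediation sprint.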
If any of this isn't already part of your IT operations playbook, it needs to be.
Best Practices for Responsible AI Use in Client Workflows
We closed the session with a few universal principles for any professional experimenting with AI:
- Don't upload PII/PHI into public tools.
- Validate everything AI outputs before using it.
- Keep human review in the loop.
- Start small, then scale intentionally.
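The "don't upload PII/PHI" rule can be partially enforced in tooling with a pre-flight scrubber that redacts obvious identifiers before text ever leaves the firm. This is a minimal sketch under simple regex assumptions; it catches a handful of common formats and is no substitute for a real DLP gateway.

```python
import re

# Illustrative patterns; a production scrubber would cover many more
# identifier types (MRNs, dates of birth, addresses, account numbers).
REDACTIONS = [
    (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "[SSN]"),
    (re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"), "[EMAIL]"),
    (re.compile(r"\b\(?\d{3}\)?[-.\s]\d{3}[-.\s]\d{4}\b"), "[PHONE]"),
]

def scrub(text: str) -> str:
    """Replace obvious identifiers with placeholders before any AI call."""
    for pattern, placeholder in REDACTIONS:
        text = pattern.sub(placeholder, text)
    return text
```

For example, `scrub("Call Jane at 813-555-0101 or jane@example.com")` returns the sentence with `[PHONE]` and `[EMAIL]` substituted in, so a brainstorming prompt can still go to a public tool without carrying the client's contact details along.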
Many tools in your current stack (like Microsoft 365, Zoom, Slack, Smokeball, Clio) are quietly rolling out AI features. If you're not reviewing those settings and assessing how data flows through them, you may already be exposed.
So What About That Cat?
The AI-generated cat custody brainstorm wasn't just comic relief. It highlighted the real value of AI in emotionally nuanced situations. It gave us:
- Legally plausible outcomes
- Thoughtful considerations (e.g., animal stress, children's attachment)
- A wide menu of options: some standard, some creative
That's what good AI-assisted practice looks like: human-first, insight-driven, and always reviewed through a professional lens.
One of the attendees summed it up best:
"This was a great primer into the power of AI to positively impact all our workflows. For the AI uninitiated, it provided a base knowledge of the capabilities and pitfalls of this new technology. I just wish we had more time."
– Rick Rhodes, CPA
The Big Takeaway from FACP Conference
Collaborative law is built on a foundation of trust, care, and emotional intelligence. AI doesn't replace that; it supports it. But only if it's implemented securely, responsibly, and with IT expertise that understands both the tech and the context.
That's what we bring to the table at Sequentur. Whether you're exploring Microsoft Copilot, planning a private LLM deployment, or just trying to get your head around what's safe to use in your firm, we've got your back.