AI Governance

Legal strategy for startups building with and deploying AI. From EU AI Act compliance to IP ownership of AI-generated outputs.

AI has changed how software gets built. Founders are shipping production apps in days using tools like Claude Code, Cursor, and Copilot. But the legal infrastructure hasn't caught up, and most lawyers have never touched the tools their clients are building with.

I build web and iOS apps with Claude Code and Cursor myself. When I advise on AI governance, I'm not theorizing about risks I've read about in a white paper. I've encountered them in my own codebases. That firsthand experience is what makes the difference between generic legal advice and governance strategy that actually maps to how your team works.

What I Cover

EU AI Act Compliance

Risk classification analysis for your AI systems, conformity assessment preparation, transparency and documentation requirements, and ongoing compliance monitoring strategy. The Act applies to US startups with EU users.

AI Output Ownership & IP

Structuring IP ownership when your codebase is partially or fully AI-generated. Drafting MSAs, SOWs, and assignment clauses that account for AI-assisted deliverables and the evolving copyrightability landscape.

Copyright Risk in AI-Generated Code

Assessing and mitigating the risk that AI-generated code in your product may be substantially similar to copyrighted code in the model's training data. Practical strategies beyond just "hope for the best."

Open-Source License Exposure

Evaluating copyleft contamination risk when AI tools trained on open-source repositories generate code for your proprietary product. Scanning strategies and internal policy development.
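As a concrete illustration of what a lightweight scanning pass can look like, here is a minimal Python sketch that flags copyleft SPDX identifiers declared in source file headers. The license list, file extensions, and function name are illustrative assumptions, not legal guidance, and teams typically pair a check like this with a dedicated license scanner.

```python
import re
from pathlib import Path

# Illustrative (not exhaustive) set of copyleft SPDX identifiers.
COPYLEFT = {"GPL-2.0-only", "GPL-2.0-or-later", "GPL-3.0-only",
            "GPL-3.0-or-later", "AGPL-3.0-only", "AGPL-3.0-or-later"}

SPDX_RE = re.compile(r"SPDX-License-Identifier:\s*([\w.+-]+)")

def flag_copyleft(root: str, exts=(".py", ".js", ".ts", ".go")) -> list[tuple[str, str]]:
    """Return (path, license) pairs for files declaring a copyleft SPDX ID."""
    hits = []
    for path in Path(root).rglob("*"):
        if path.suffix not in exts or not path.is_file():
            continue
        # Only inspect the first few lines, where SPDX headers live by convention.
        for line in path.read_text(errors="ignore").splitlines()[:10]:
            m = SPDX_RE.search(line)
            if m and m.group(1) in COPYLEFT:
                hits.append((str(path), m.group(1)))
    return hits
```

A check like this catches only declared licenses; detecting undeclared similarity to copyleft code requires fingerprint-based scanning tools and is one reason internal policy matters as much as tooling.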

Internal AI Governance Policies

Developing acceptable use policies for AI tools across your organization. Which tools are approved for what data? How do you handle AI-generated outputs in customer-facing products?

Contract Modernization

Updating your MSAs, SOWs, Terms of Service, and Privacy Policies to reflect the reality that AI changes how products are built and delivered. Most commercial contracts were written for a pre-AI world.

Why AI Governance Matters Now

AI governance isn't a compliance checkbox. It's a competitive advantage. Enterprise customers are starting to ask about your AI practices before signing contracts. Investors are asking about IP ownership before writing checks. Regulators in the EU, and increasingly in the US, are building enforcement frameworks with real teeth.

The startups that get ahead of this don't treat AI governance as a burden. They treat it as a signal to the market that they're serious, that they understand the technology they're building with, and that they can be trusted with enterprise data and AI-powered decisions.

Getting this right early, when your team is small and your product is still taking shape, is dramatically easier and cheaper than retrofitting governance after you've shipped to hundreds of customers.

Frequently Asked Questions

What does an AI governance lawyer do for startups?

An AI governance lawyer helps you navigate the legal and regulatory landscape of building with AI. This includes EU AI Act compliance, IP ownership of AI-generated outputs, copyright risk in AI-assisted code, open-source license exposure, and developing internal policies for how your team uses AI tools. As AI becomes central to how products are built, governance is a prerequisite for enterprise sales, fundraising, and regulatory readiness.

Does the EU AI Act apply to US startups?

Yes. The EU AI Act has extraterritorial reach, similar to GDPR. If your AI system is used by or affects people in the EU, the Act likely applies regardless of where you're incorporated. Startups with any European user base need to understand their obligations under the Act's risk classification framework.
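For orientation only, the Act's four-tier structure (prohibited practices, high-risk systems, limited-risk transparency obligations, and minimal risk) can be sketched as a rough triage helper. The labels and example use cases below are a simplified paraphrase for illustration, not a compliance determination; real classification turns on the Act's Annex III categories and detailed legal analysis.

```python
# Rough, non-authoritative sketch of the EU AI Act's four risk tiers.
PROHIBITED = {"social_scoring", "subliminal_manipulation"}   # banned outright (Art. 5)
HIGH_RISK = {"hiring", "credit_scoring", "biometric_id"}     # Annex III categories
TRANSPARENCY = {"chatbot", "deepfake_generation"}            # disclosure duties

def triage(use_case: str) -> str:
    """Map an illustrative use-case label to an AI Act risk tier."""
    if use_case in PROHIBITED:
        return "prohibited"
    if use_case in HIGH_RISK:
        return "high-risk: conformity assessment required"
    if use_case in TRANSPARENCY:
        return "limited risk: transparency obligations"
    return "minimal risk"
```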

Who owns code written by AI tools like Claude Code or Cursor?

This is evolving. Purely AI-generated works may not be copyrightable under current US law (which requires human authorship), but most AI-assisted code involves significant human direction that likely qualifies. The more critical question is contractual: what do your AI tool provider's terms say, and how do your MSAs handle AI-assisted deliverables?

What open-source risks exist when using AI coding tools?

AI coding assistants trained on open-source repositories can generate code substantially similar to copyleft-licensed code (like GPL), which could impose obligations on your entire codebase. This requires proactive scanning, clear internal policies, and contract provisions that allocate IP risk appropriately.

Need AI governance guidance?

Whether you're navigating the EU AI Act, structuring IP ownership for AI-generated code, or building internal governance policies, let's talk.

Schedule a Consultation
