For practitioners advising companies that build, deploy, or invest in digital products and services, the regulatory environment has shifted significantly in a short period. The ubiquitous use of artificial intelligence in business, evolving data-protection regimes, and heightened scrutiny of children's online safety raise complex data ownership and compliance issues.
Key regulatory developments
Three areas in particular deserve close attention from in-house counsel, technology lawyers, and data-privacy and IP professionals right now:
Data sharing and cybersecurity obligations. The latest wave of product compliance requirements, including both U.S. state-level mandates and European regulations, affects how companies structure data flows across their platforms and services. For technology counsel, the challenge is twofold. On one hand, new cybersecurity obligations at the product level are raising the bar on what "secure by design" actually means in practice, from software supply-chain transparency to incident-reporting timelines. On the other, data-sharing arrangements that were once routine are now running into restrictions that differ by jurisdiction. Companies that depend on cross-border data flows to power AI-driven features or analytics capabilities should map requirements against their existing contracts and vendor relationships before an enforcement action forces the issue.
Use of data for AI training and related copyright considerations. The question of who owns what, and on what terms data and content can be used to train AI systems, should be front of mind. The answers are far from settled. U.S. courts are working through fair use arguments in cases that could define the boundaries of permissible AI training for a generation, while the European Union has taken a more prescriptive approach through the EU AI Act and related copyright frameworks that impose distinct transparency and opt-out obligations. The practical gap between these regimes has consequences: licensing agreements for training data need to account for jurisdictional asymmetries in copyright treatment, and due diligence in AI-related M&A and financing transactions should ask not just what data was used but how it was sourced, whether proper consents or licenses were obtained, and what open-source components sit in the training stack.
Children's protection in digital products and services. Age-verification and youth-safety obligations are becoming a core compliance issue for any company whose products or services touch younger users, and the regulatory momentum shows no signs of slowing. In the United States, a patchwork of state-level age-appropriate design codes is creating compliance complexity for platforms that operate nationally, while the European Union continues to refine its approach under the Digital Services Act and related regulations. Practical challenges without easy answers include implementing age-verification mechanisms that satisfy regulators without undermining user experience or creating new data-privacy liabilities. For in-house teams, the key issue to spot is the interaction between youth-safety mandates and existing privacy frameworks. Age verification requires collecting sensitive data, which means companies need to think carefully about how verification workflows align with their broader data-minimization and consent obligations.
The cross-jurisdictional dimension
What makes these three issues especially difficult (and important to get right) is that none of them can be analyzed through a purely domestic lens anymore. The interplay between American and European regulatory regimes on data sharing, AI copyright, and children's safety is actively shaping deal structures, compliance programs, and enforcement risk. A licensing agreement negotiated solely with U.S. fair use principles in mind may be materially deficient under EU frameworks. A cybersecurity compliance program designed around a single jurisdiction's requirements may leave gaps that a regulator on the other side of the Atlantic will be quick to identify. For counsel advising technology companies with any cross-border footprint (which, in practice, means nearly all of them), fluency in both the U.S. and European approaches is no longer a specialty skill. It is a baseline expectation.
Further discussion
Join us as Sarah Bruno (Partner, San Francisco), Dr. Andreas Splittgerber (Partner, Munich), Christian Leuthner (Partner, Frankfurt), and Mitesh Patel (Associate, Silicon Valley) dig into these topics in an interactive, discussion-driven format with dedicated time for audience questions and real-world scenarios. The session will be a useful opportunity for practitioners, whether in-house counsel, investors, or outside advisors, to engage with these topics in a collaborative setting.
Date: Tuesday, April 28, 2026
Location: Reed Smith Silicon Valley - Event Center, 1899 Page Mill Rd, Palo Alto, CA 94304
Schedule:
8:30 a.m. – 9:00 a.m. PT | Registration and breakfast
9:00 a.m. – 10:00 a.m. PT | CLE presentation and discussion
Registration is required, and seating is limited. RSVP now!