Welcome to a new episode of “A Little Privacy, Please!” We’re thrilled to introduce the newest partner on our Cybersecurity & Privacy team: Jacqueline (Jackie) Cooney.
With more than 30 years of experience as a cybersecurity and privacy attorney, Jackie helps clients establish cybersecurity programs, enhance their data protection strategies, and navigate data breach mitigation and response.
We invited Jackie to share more about her background, her approach to privacy compliance, and what she sees on the horizon for privacy and cybersecurity.
Talk to us about your journey. How did you get started in privacy law?
It really started when I was in my early twenties, working in the Senate during the birth of the internet. There weren’t many rules about what data could be shared, stored, or sold online. My boss—the senator I worked for—became very interested in this, and that was my introduction to the field. Around that time, HIPAA also passed. It was the beginning of the government recognizing that personal information and privacy required protection.
From there, I worked at a variety of firms, both in consulting and legal, to develop a practice focused on helping companies create sustainable, flexible, and compliant frameworks. I’m thrilled to be here at Nixon Peabody. It’s a natural fit for me, the work I do, and the clients I serve.
When should companies contact a privacy compliance attorney?
To be clear, I am not a litigator. That said, I certainly have clients who call me when they’ve received a demand letter or been served with a claim, but my work is truly underpinned by regulatory compliance.
Clients should reach out when they have questions about a particular regulation—does it apply to us? How does it apply? How do we comply?
More importantly, my practice focuses on helping companies take a broader view of their data governance, privacy, and cybersecurity compliance. I help them understand where their data is, where it’s coming from, where it’s being transferred, and how they’re using it. Then we create compliant documentation, processes, and workflows to build a sustainable privacy program. That’s the bulk of what I do: digging into their programs, figuring out what needs fixing, and determining how to do it from an operational, practical perspective.
What are the biggest privacy and cybersecurity challenges facing companies today?
Great question. There’s always a “flavor of the month.” We’ve seen a lot of California Invasion of Privacy Act (CIPA) demand letters and lawsuits recently. Many clients come to me saying, “I’ve gotten this letter,” or “I’ve had a suit filed against me,” or “I’ve gotten a demand for arbitration. What should I do?” My job is helping clients figure out how they ended up there. What’s going on with your website that attracted someone looking to file a lawsuit?
Companies are taking a deeper look at their online presence—their websites, their ad tech ecosystems—to determine whether they’re doing what they need to do to comply with the law. Are they providing the right notices and consents? Are they setting proper expectations for consumers? That’s been a lot of my focus over the last couple of years, especially from a holistic perspective. Many companies I work with have affiliates and subsidiaries, several dozen websites, mobile apps, and other consumer touchpoints. That’s a big chunk of what clients have been worried about lately.
As far as what’s on the horizon: AI is the story of the hour. Clients are becoming really concerned from a couple of perspectives. One is regulatory. The AI regulatory landscape is unsettled at the moment. Companies don’t know which laws apply to them now or which will apply soon, beginning with the EU AI Act, and here in the United States with executive orders and states taking their own action.
What this really means for our clients is a governance issue. They’re trying to understand how they’re using data, when they should disclose that they’re using AI, and what protections should be placed on the AI they use. A couple of years ago, many clients wanted to create policies prohibiting employees from using AI. What they really meant was that they didn’t want any company or personal information entered into ChatGPT.
Going forward, AI is helping companies grow and innovate. The work I see coming our way this year is balancing the risks around AI, regulatory and otherwise, with actual governance, rules, and thoughtfulness about how it’s used.
What type of privacy work do you enjoy most?
I really love doing full privacy program assessments and remediations. That honestly comes from my consulting background. It’s legal work, for sure—we’re talking about how to comply with laws—but so much of it relates to business operations. How do you integrate these rules and requirements without stymying business or getting in the way of profit?
So many clients hear from their in-house and external counsel, “No, you can’t do that.” We would never tell a client not to comply with the law, but we can help them think through a risk-based approach to implementing a program where they can use data comfortably, meet their business needs, and satisfy regulatory expectations. That’s what I really like to do.
I’m always happy to help clients with specific needs or questions, but I love the bigger projects—really digging in and getting to know the client.