As lawyers, confidentiality isn’t just important—it’s foundational. Yet as we increasingly integrate generative AI and other tech tools into our practice, maintaining that confidentiality demands attention. Let’s walk through exactly what you need to know to keep client data secure, even as you harness powerful new technologies.

1️⃣ Know Where Your Data Goes. Before using any AI-powered tool, clarify how your data is stored, who can access it, and whether it’s shared with third-party services.

2️⃣ Evaluate Data Security Practices. Robust data security is non-negotiable. Always verify a vendor’s data security certifications, encryption standards, and access controls.

3️⃣ Limit and Control Your Data Inputs. Use only the data you truly need to provide. The more data you input, the higher the confidentiality risk.

4️⃣ Use Built-in Privacy Controls. Many reputable AI tools offer privacy modes or confidential environments that ensure your inputs won’t be used for model training or seen by unauthorized personnel.

5️⃣ Regularly Audit and Review. Integrating technology is never a “set it and forget it” scenario. Regularly review and audit your chosen tools and their privacy compliance.

I’m Joe Regalia, a law professor and legal writing trainer. Follow me and tap the 🔔 so you won't miss any posts.
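Point 3️⃣ can be partly automated. Below is a minimal sketch of a redaction pass run before a prompt ever leaves your machine; the two patterns shown (email address, US SSN) are illustrative assumptions, not a complete PII inventory, and a real matter would need a far broader rule set.

```python
import re

# Illustrative patterns only -- NOT a complete PII inventory.
PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def redact(text: str) -> str:
    """Replace matched identifiers with bracketed placeholders."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

prompt = "Client Jane Roe (jane.roe@example.com, SSN 123-45-6789) disputes the lien."
print(redact(prompt))  # → Client Jane Roe ([EMAIL], SSN [SSN]) disputes the lien.
```

Running inputs through a pass like this keeps the habit of data minimization mechanical rather than relying on memory under deadline pressure.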
Protecting Confidentiality
Summary
Protecting confidentiality means keeping sensitive information private and secure, whether it's client data, workplace complaints, or photos shared online. In today’s digital world, safeguarding this information requires awareness and careful practices, especially when using technology or sharing content publicly.
- Review data sharing: Before uploading information to AI tools or online platforms, always check where your data goes, who can access it, and how long it will be retained.
- Practice mindful sharing: Avoid posting identifiable client or participant images online; use blurred faces or creative angles instead to respect privacy and uphold trust.
- Document your actions: If disclosing confidential matters at work, keep written records and follow up conversations in writing, since details can be shared beyond your initial audience.
-
Be careful where you put your data. It’s tempting to drop confidential data into open AI tools for quick analysis or content generation. But here’s why that’s dangerous:

• Open LLMs store prompts and responses. Your data could be retained, reviewed, or used for model training.
• Data privacy and compliance risk. HIPAA, PCI, and internal confidentiality policies can be violated instantly.
• Competitive exposure. You wouldn’t hand internal strategy decks to a stranger – treat open LLMs the same.

How to do this securely:

• Use enterprise AI platforms with private, encrypted deployments
• Confirm that zero-data-retention policies are in place
• Leverage local LLM models or cloud-based models within your secure environment
• Consult your CISO or data privacy team before using generative AI with proprietary information

We deploy AI for clients only in controlled, secure environments to protect their IP and customer data while delivering the efficiency gains AI offers. Don’t trade security for speed. If you want to implement AI safely within your organization, let’s connect. OTG Consulting #AI #DataPrivacy #Security #LLM #AIImplementation #Cybersecurity
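The compliance bullet (HIPAA, PCI) suggests one concrete guardrail: a pre-flight check that refuses to forward any prompt containing something that looks like a payment card number. A minimal sketch follows; the Luhn checksum is the real check used for card numbers, but this pattern set is an illustrative assumption and nowhere near a complete DLP filter.

```python
import re

def luhn_valid(digits: str) -> bool:
    """Luhn checksum, the self-check used by payment card numbers."""
    total = 0
    for i, ch in enumerate(reversed(digits)):
        d = int(ch)
        if i % 2 == 1:          # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

def contains_card_number(text: str) -> bool:
    """Flag 13-19 digit runs (spaces/dashes allowed) that pass Luhn."""
    for match in re.finditer(r"(?:\d[ -]?){13,19}", text):
        digits = re.sub(r"[ -]", "", match.group())
        if 13 <= len(digits) <= 19 and luhn_valid(digits):
            return True
    return False

# "4111 1111 1111 1111" is the well-known Luhn-valid Visa test number.
blocked = contains_card_number("Card: 4111 1111 1111 1111")
```

A check like this would sit in front of the API call in your secure environment; anything flagged gets routed to a human instead of the model.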
-
Last week, an employee came to me after reporting her manager for harassment - trusting HR to keep it confidential. Instead, HR passed the complaint to the manager's boss, who told the manager everything. By week's end, her manager had turned her words into a threat - and her job into a target.

Here's what being a former corporate counsel taught me about HR's quietest, biggest lie: When HR says "This conversation is confidential," they mean: "Everything you say will be documented, distributed, and potentially used against you."

I've sat in those meetings. I've seen the reports. I've watched the aftermath. The truth? Your "confidential" conversation gets shared with:
1. Your direct manager
2. Their manager
3. Legal department
4. Executive team
5. Anyone deemed "relevant" to the investigation

But it gets worse. Remember those "performance issues" that suddenly appeared after your complaint? That's because HR took your vulnerable moments and reframed them as evidence:
"She admitted feeling anxious" becomes "Unable to handle workplace pressure"
"He mentioned being distracted" turns into "Lack of focus and productivity"
"They expressed concerns about the team" transforms to "Not a cultural fit"

I've watched this playbook destroy careers for years. Now I'm helping employees protect themselves. Three rules I want you to remember:
1. Document everything BEFORE going to HR
2. Assume every word will be shared
3. Get things in writing - after any verbal conversation, send a follow-up: "As discussed today…"

Protect yourself first. The company already has an entire department doing the same. Follow for more insider insights on protecting your workplace rights. #EmploymentAttorney #CaliforniaEmploymentLaw #EmployeeRights
-
Dear students,

It’s truly heartening to see so many of you sharing your internship experiences; celebrating your growth, skills, and the joy of working with clients and participants. However, I want to be honest about something that’s been weighing heavily on me.

Every time I see internship photos on LinkedIn or Instagram - images of clients or participants with visible faces, often shared without careful thought - I feel genuinely heartbroken. It’s not just a lapse in judgment; it’s a breach of professional ethics. In our rush to showcase our work, we forget what truly matters: trust, respect, and confidentiality. Portraying our work in a way that compromises these principles undermines the very ethos we stand for.

Colleges have a duty to teach students what can and cannot be shared on social media and to instil ethical practice from the outset. Sharing photos of clients or participants - whether they’re holding balloons or making boats - must be done with explicit consent and full awareness of how these images will be used and who will see them.

Let’s be mindful. Let’s be ethical. Our professionalism extends beyond just the work we do; it includes how we respect and protect those we aim to help. Confidentiality is the cornerstone of our profession.

When we post photos with visible faces - regardless of permission - we are breaching trust, because by showcasing their faces we are using their images to boost engagement, increase visibility, and elevate our presence on social media. We are prioritising our gains over their dignity and privacy. Do they understand that their faces are being used as tools to promote us? Are they aware their images could be shared widely or exploited? This is a serious breach of respect and ethics. It robs them of their dignity and reduces their participation to a mere commodity.

Here’s what I suggest: take photos from angles where faces aren’t visible, or blur faces before sharing. These small steps show respect and uphold dignity.

Remember, social media algorithms may push for more likes, but our profession isn’t about chasing views - it’s about integrity, compassion, and standing by our core values. Let’s be responsible, respectful, and hold ourselves to the highest standards because their dignity and the integrity of our profession depend on it. #psychology #therapy #mentalhealth #internships
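The practical suggestion above (blur or obscure faces before sharing) can be automated as part of a posting workflow. Here is a dependency-free sketch of the anonymization step itself, pixelating a region of a 2D grayscale array; in practice you would use an image library such as Pillow plus a face detector to find the box to anonymize, both of which are assumptions beyond this toy example.

```python
def pixelate_region(img, top, left, h, w, block=4):
    """Pixelate (mosaic) a rectangular region of a 2D grayscale image by
    averaging each block x block tile -- a simple face-anonymization pass."""
    out = [row[:] for row in img]          # don't mutate the original
    for by in range(top, top + h, block):
        for bx in range(left, left + w, block):
            tile = [out[y][x]
                    for y in range(by, min(by + block, top + h))
                    for x in range(bx, min(bx + block, left + w))]
            avg = sum(tile) // len(tile)
            for y in range(by, min(by + block, top + h)):
                for x in range(bx, min(bx + block, left + w)):
                    out[y][x] = avg
    return out

# 8x8 toy image: dark left half (0), bright right half (200).
img = [[0] * 4 + [200] * 4 for _ in range(8)]
anon = pixelate_region(img, 0, 0, 8, 8, block=8)  # one tile covering all
# every pixel becomes the tile average, 100 -- detail is unrecoverable
```

The point of averaging whole tiles (rather than a light blur) is that the original detail inside each tile is genuinely discarded, not merely softened.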
-
AI Knows Patterns, Not Secrets: What Company Secretaries Need to Understand

There is still a great deal of misunderstanding about how AI tools like ChatGPT actually function, particularly when it comes to confidentiality in the governance space. I often hear concerns along the lines of, "If I upload minutes into an AI platform, could someone else later access that information?"

The answer, quite simply, is no. These systems do not store or cross-share your content in a way that makes it available to other users. You cannot log in and retrieve someone else’s board papers, nor can anyone access yours through general AI queries. The information is not pooled into some giant searchable governance database. Conversations are isolated.

But that does not mean the risk disappears. In fact, this is where governance professionals, and Company Secretaries in particular, need to approach AI with precisely the skillset we already bring to the table: a strong sense of structure, process, and duty of care.

The real risks are more nuanced, and they sit in areas governance professionals understand well:
• Data retention policies: Where does your data reside? For how long? Who has administrative access to the AI system?
• Jurisdictional issues: In which country is your data stored? Are you complying with your local privacy, corporate, and market regulations?
• Third-party providers: Do you know how your AI provider secures and governs data behind the scenes?
• Cybersecurity: What happens if the provider experiences a breach? Could sensitive board information be exposed indirectly?

The danger is not that your board pack will suddenly appear in someone else's AI chat. The danger is that you may be placing highly sensitive corporate information into systems where you no longer have full visibility or contractual control over its custody. This is where the governance profession needs to take ownership. AI is not inherently risky. Poor data governance is.

As AI tools become more embedded in governance workflows, Company Secretaries have an opportunity to lead the conversation about how to use these tools both responsibly and effectively. Efficiency must never come at the cost of fiduciary responsibility. #Governance #CompanySecretary #AIinGovernance #DataGovernance #Confidentiality #DirectorsDuties #BoardSupport #GovernanceProfessionals #CorporateGovernance #AICompliance #BoardRisk
-
I’ve been closely following the news this week about a high-profile communications mishap where a private group chat among senior government officials inadvertently included a journalist. In an instant, a confidential conversation turned public.

No encryption protocol or secure application can fully counteract human error. While robust encryption is vital, it cannot prevent the accidental addition of unintended recipients. The weakest link in any secure system is often not the technology itself, but the human element, and even seasoned leaders can make mistakes under pressure.

Let’s face it: we’ve all hit “send,” only to realize too late that someone on the thread shouldn’t have been included. These moments are human. In high-stakes environments, though, the cost of a mistake is far greater. We should treat this as a learning moment about the importance of strong communication safeguards.

As a leader, the ability to communicate and share information with confidence and transparency is critical, and yet that has never been riskier than it is today. Internal communications, the lifeblood of any organization, are often the hardest to secure and present daily risk. We must ask ourselves: are we doing everything we can to ensure our tools match the sensitivity of our conversations?

This is a challenge I think about every day — it’s the reason I founded EchoMark. Our mission is to enhance trust, accountability, and confidentiality in digital communication. We believe leaders at the highest levels, from the board room to the Situation Room, deserve tools that don’t force a choice between convenience and security. By preserving the ease of open dialogue while fortifying privacy, we aim to set a new standard for secure communication.

We need communication solutions that anticipate mistakes instead of reacting to them. Security features should be intuitive and proactive. This means rethinking protocols so confidentiality is never left to chance. Verifying participants before a discussion, preventing unauthorized forwarding or screenshots, and embedding digital “fingerprints” in content can all enforce accountability without hindering collaboration.

At EchoMark, we’ve built technology that does just that by embedding invisible, traceable watermarks into digital content so that if sensitive information is ever leaked, it’s instantly clear where it came from. It’s a subtle but powerful way to deter leaks and create a culture of accountability without disrupting the speed or ease of communication leaders rely on.

This incident is a reminder that we can and must do better. Together, let’s champion human-centric communication practices and technologies that match the sensitivity of our conversations. With foresight and innovation, we can ensure that when leaders speak in confidence, they do so with the assurance that their words will remain private. #Cybersecurity #AI #DataProtection www.echomark.com
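EchoMark's technology is proprietary, but the general idea of an invisible, per-recipient fingerprint can be illustrated with a toy sketch: encode a recipient ID as zero-width Unicode characters appended to a message, then recover the ID from a leaked copy. This is an illustrative stand-in, not EchoMark's actual method, and a scheme this simple is trivial to strip; it only shows the shape of the technique.

```python
ZW = {"0": "\u200b", "1": "\u200c"}   # zero-width space / zero-width non-joiner
ZW_REV = {v: k for k, v in ZW.items()}

def watermark(text: str, recipient_id: int) -> str:
    """Append the recipient ID as 16 bits of zero-width characters."""
    bits = format(recipient_id, "016b")
    return text + "".join(ZW[b] for b in bits)

def extract(text: str) -> int:
    """Recover the recipient ID from a (possibly leaked) copy."""
    bits = "".join(ZW_REV[ch] for ch in text if ch in ZW_REV)
    return int(bits, 2)

memo = watermark("Q3 board pack attached.", recipient_id=42)
# The marked copy renders identically to the original...
assert memo.startswith("Q3 board pack attached.")
# ...yet identifies which recipient's copy leaked.
assert extract(memo) == 42
```

Because each recipient gets a distinct mark, a leaked screenshot of the text (retyped aside) points back to a specific copy, which is the accountability property described above.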
-
Your contract ended. Your risk didn’t. Know how?

There’s a thing called "Post-Termination" obligations. For bigger projects, risks remain even after the project ends. And this is what many new founders overlook.

Normally, contracts end. But does your confidentiality obligation end with them? What about:
• Indemnity?
• IP rights?

Most people assume these protections stick around. They don’t. Not unless you make it clear with a survival clause.

The most common situation I can share with you is this: Imagine you’re a software developer. You just wrapped up a big project for a fintech client. The contract ends. Two months later, you discover they’re leaking sensitive code you shared. You demand action. Their lawyer points to a loophole: “No survival clause. Your confidentiality obligation ended when the contract did.” Now your hands are tied.

This isn’t hypothetical. It happens. Why? Because people underestimate what happens after a contract ends.

So there are two risks you have to look out for:
1. Loss of Protection. Without a survival clause, your confidentiality agreement vanishes with the contract.
2. IP Disputes. Failing to extend IP rights can leave you fighting for control over your own work.

But how can you get it right? Two main ways I suggest:
a) Identify Key Clauses. Confidentiality, IP rights, liability, and indemnity are usually critical.
b) Specify a Duration. Confidentiality for 5 years? Indemnity forever? Be precise.

So remember that contracts end. Risks don’t. A survival clause keeps your most important protections alive. Even when the contract is long gone.

——
📌 If you need Contracts that consider the risks that carry over after a Contract ends, then DM me "CONTRACT".
-
Directors (and officers) have a legal duty not to misuse information they receive in the course of their duties. That obligation continues to exist, even if the company doesn't.

How confident are you that every director on your board is aware of their obligations and will maintain confidentiality at all times? Have you checked that you have passed on that requirement in any consulting or advisory agreements you enter into?

Confidentiality should not be absolute. There may be situations where disclosure is necessary, such as to comply with legal obligations or protect the safety of individuals. In such cases, it's essential for the board to have clear policies and procedures in place for handling confidential information and determining when disclosure is appropriate. Are you confident that you have the right policies to protect you if disclosures are required?

It is a standard expectation that board business will be kept confidential. This includes not talking about discussions or decisions of the board with people who are not on the board, even if they are your nominator to the board. When confidentiality is breached, it is usually a civil matter, and the only form of redress may be recompense for damages. When those damages are measured in reduced reputation or embarrassment, there may be little that the board can do to impose a penalty on the transgressor (or to discourage potential transgressions).

Here are some examples of how keeping things confidential helps boards govern better:
· Trust: Confidentiality fosters an environment of trust among board members. When discussions are kept confidential, directors feel more comfortable sharing sensitive information without fear of it being disclosed improperly.
· Legal Protection: By maintaining confidentiality, the board can mitigate the risk of defamation lawsuits. If defamatory information were to be leaked to the public, it could result in legal consequences for the organization and its members. Confidentiality helps prevent such leaks.
· Open Dialogue: Board members need to have open and honest discussions to effectively address issues facing the organization. Confidentiality allows for these discussions to take place without fear of public backlash or repercussions.
· Protecting Reputations: Confidentiality helps protect the reputations of individuals involved in the discussions. Without confidentiality, false or damaging information could be spread, tarnishing the reputations of both the organization and its members.
· Encouraging Transparency: Paradoxically, confidentiality can encourage transparency within the boardroom. Knowing that discussions will be kept confidential allows members to speak freely, share diverse perspectives, and explore potential solutions without fear of judgment or reprisal.

Do you need to have a conversation about confidentiality or conflicts of interest? #boards #directors #confidential #governance
-
Renewing vendor contracts? Don't forget to add in AI terms! Your vendors have likely incorporated AI somewhere in their products since the last time you reviewed their agreement. Make sure to include these terms:

1️⃣ Data Protection: Ensure vendors can't use your data to train AI models or share it with others.
2️⃣ Confidentiality: Include clauses that guarantee your data stays secure and is not used for other clients.
3️⃣ Monitor Scope Creep: Require vendors to notify you about any AI-related updates or new features added post-contract.
4️⃣ Audit Rights: Secure the right to review data handling practices, especially for sensitive information.
5️⃣ Retention Policies: Define clear guidelines for how long data can be retained by the vendor, aligning with your internal standards.

Here are some vendor MSAs for inspiration in the comments 👇
-
In August, a Nashville man was indicted for running a "laptop farm." He allegedly convinced companies to hire him as a remote worker but, instead of doing the work, downloaded and installed software on company computers that granted access to foreign bad actors posing as workers, breaching company security and funneling money abroad. This may sound like an outlandish story, but easy access to AI-generated audio and video heightens the risk of employee impersonation.

Ways for companies to protect against employee impersonation:

Before hiring:
• Running background checks (and following state/local notice and disclosure requirements)
• Vetting educational and employment background
• Using secure methods for checking identity and work authorization. Especially for sensitive roles that are fully remote, consider flying the candidate out to meet in person or hiring a vendor who can vet their identity in person.
• Requiring employees to sign robust confidentiality agreements

During employment:
• Working with IT/InfoSec to develop best practices for securing company data
• Monitoring employee login patterns and downloads
• Developing protocols for exchanging money and sensitive information (for example, requiring multiple points of verification)
• Meeting on video occasionally, even if you don't regularly work on video
• Training managers to keep an eye out for suspicious activity

After employment:
• Reminding employees of their confidentiality obligations
• Securing company data immediately upon separation and monitoring use when employees give notice of resignation
• Reviewing returned hardware and properly wiping equipment

What else?
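The "monitoring employee login patterns" bullet can start very simply, for example flagging sessions at unusual hours or from previously unseen networks. A toy sketch follows; the two features and the history format are illustrative assumptions, and a production system would use real telemetry and proper anomaly detection rather than exact set membership.

```python
def flag_login(history: list[dict], login: dict) -> list[str]:
    """Return reasons a login deviates from this employee's history.
    Entries look like {"hour": int 0-23, "ip_prefix": str} -- an assumed,
    simplified schema for illustration."""
    reasons = []
    usual_hours = {h["hour"] for h in history}
    usual_nets = {h["ip_prefix"] for h in history}
    if login["hour"] not in usual_hours:
        reasons.append("unusual hour")
    if login["ip_prefix"] not in usual_nets:
        reasons.append("new network")
    return reasons

history = [{"hour": 9, "ip_prefix": "10.1"}, {"hour": 14, "ip_prefix": "10.1"}]
alerts = flag_login(history, {"hour": 3, "ip_prefix": "203.0"})
# a 3 a.m. login from an unknown network trips both checks
```

Even a crude baseline like this gives IT/InfoSec a trigger for the multi-point verification protocols mentioned above before money or sensitive data moves.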