4 Steps to Avoiding a Digital Dumpster Fire

How Meeting Planners Can Draft Smarter Data Policies and Choose Safer AI Tools

As AI tools like ChatGPT become part of everyday workflows in the meetings and events industry, a recent court ruling has made one thing clear: data privacy is no longer something planners can take for granted. With a federal court ordering OpenAI to indefinitely retain all ChatGPT conversation logs, including those previously deleted, event professionals must now take a closer look at how they use AI, what information they’re entering, and whether their existing policies are still adequate.

Here’s how to update internal and client-facing policies—and make smart choices about which AI platforms to use.

Step 1: Revisit Your Data Handling Policy

Begin by reviewing the documents and agreements that govern how you manage client and attendee data. This includes your internal privacy policies, contracts, service-level agreements (SLAs), terms and conditions, and any standard operating procedures (SOPs) your organization follows. Check whether these materials mention the use of third-party tools such as ChatGPT or other AI platforms, and whether your workflows rest on assumptions about how data is stored, deleted, or processed.

Also consider whether you’ve made assurances to clients—explicitly or implicitly—that no data will be stored beyond the scope of your engagement. Many planners work with global clients or large corporations that are bound by data regulations like GDPR or CCPA. If your current tools or practices are not aligned with those standards, this new legal environment may expose you to compliance issues or contract violations.

Step 2: Add an “AI Use” Clause to Contracts and Proposals

As more event planning tasks are augmented by AI, it’s wise to clearly disclose this to clients in your contracts and proposals. A simple clause that outlines your use of AI can go a long way in building trust. You might explain that AI tools are occasionally used to assist with content creation, communication, or planning documentation. It’s important to be honest about the specific platforms you’re using and how they handle data.

Clients should also be given the opportunity to opt out of AI-assisted workflows if they prefer to keep everything entirely manual or confidential. This clause can clarify that no personally identifiable information, sensitive financials, or protected data will be entered into AI systems without express permission. The goal isn’t to scare clients—it’s to demonstrate transparency and responsible use.

FURTHER READING: PLANNERS BEWARE—A COURT ORDER SAYS YOUR AI BRAIN DUMP MAY NOW LIVE FOREVER

Step 3: Choose AI Tools That Respect Data Privacy

For those who want the benefits of AI while minimizing risk, it’s time to evaluate your tools. Some versions of ChatGPT offer significantly stronger privacy protections than others. For example, ChatGPT Enterprise is specifically designed for businesses and organizations that need high security: business data is not used to train the model, and workspace administrators control how long conversations are retained. It also provides administrative controls and audit tools for added oversight.

Similarly, ChatGPT Edu—geared toward academic institutions—offers many of the same privacy benefits as the enterprise version. It’s a great option for planners working in education-focused settings or organizing university conferences.

Another privacy-forward option is the ChatGPT API with Zero Data Retention (ZDR), which OpenAI makes available to qualifying API customers. With ZDR in place, developers can build custom tools or applications that use AI capabilities without inputs or outputs being stored on OpenAI’s servers. This is especially useful for event tech providers or planners building in-house tools for registration, scheduling, or matchmaking.

Beyond OpenAI’s ecosystem, there are also alternatives like Anthropic’s Claude, which does not train on API customer inputs by default, or open-source models such as Llama 3, which can be run on private servers. These tools offer greater control for planners working with particularly sensitive events, such as those involving government clients, healthcare organizations, or high-net-worth individuals.

Step 4: Train Your Team (and Clients) on AI Best Practices

Of course, new tools and updated policies only work if your team knows how to implement them. This is the time to offer some light but essential training on AI best practices. Make sure your staff, freelancers, and third-party vendors understand what kind of information is appropriate to share with an AI assistant—and what should be kept offline.

They should be able to identify when a tool retains data and when it doesn’t, and they should be familiar with which AI platforms are approved for client work. Consider creating a one-page internal reference or client-friendly FAQ explaining your approach to AI. This can be a helpful onboarding tool, especially when working with corporate or government clients who need reassurance about data security.
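That one-page reference can even be backed by a simple screening habit. The sketch below is a hypothetical illustration of the idea, not a real PII detector: the two regex patterns (for emails and US-style phone numbers) and the `redact` helper are examples invented here, and any production policy would rely on approved tooling and human review rather than a script like this.

```python
import re

# Illustrative patterns only -- these catch common email and US phone
# formats, and are far from a complete PII detector.
PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "PHONE": re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),
}

def redact(text: str) -> str:
    """Replace each matched pattern with a labeled placeholder."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label} REDACTED]", text)
    return text

note = "Contact Jane at jane.doe@client.com or 555-867-5309 about the gala."
print(redact(note))
# → Contact Jane at [EMAIL REDACTED] or [PHONE REDACTED] about the gala.
```

Running client text through a check like this before it ever reaches an AI assistant makes the training concrete: staff see exactly which details your policy treats as off-limits.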

AI is here to stay, and it’s changing how we work for the better. But planners who treat data privacy with the seriousness it deserves will be the ones who thrive. By updating your internal policies, clarifying your use of AI in contracts, choosing platforms with strong privacy standards, and training your team, you’ll future-proof your business in a world where deleted doesn’t always mean gone.

Any thoughts, opinions, or news? Please share them with me at vince@meetingsevents.com.


Digital Tipping 101

Once upon a time, tipping while attending an offsite meeting or event was easy: slip a few bills to the bellman, leave something on the pillow for housekeeping, and hand off a discreet envelope at checkout. But as hotel and event services increasingly shift to digital and cashless systems, the simple act of saying “thank you” has gotten more complex.
