The Risks of AI Chatbots and How to Mitigate Them
Almost every external meeting I’ve joined in the past several weeks has begun with a request to approve or remove an AI chatbot.
We’re all excited about the capabilities and productivity improvements AI promises, but there are real risks to privacy and corporate secrets that we must consider before exposing data to AI. We tend to think of this principally as an internal question (which tools will my employees use?), but that framing ignores the potential harms introduced by external users.
One of my favorite daily use cases for AI is meeting notes, and plenty of vendors have rushed meeting note-takers to market. Microsoft offers meeting notes through Teams Premium, but many others offer bots that join meetings as discrete note-taking participants.
These participants may record audio without consent, capture names and contact information for other attendees, and collect data without the safeguards conferred by tools like Teams Premium, which can honor internally defined sensitivity requirements. Some of this risk is a threat to intellectual property, but there’s a huge risk to privacy as well, and having these bots join your corporate meetings may put you afoul of privacy regulations. You do not want the EU beating down your door because a chatbot was added to a meeting.
In one meeting, the entire event was brought to a halt while the organizer conferred with his legal department over whether the bot could attend. Forty people from various organizations sat in silent limbo for five minutes awaiting a resolution, a disruptive affair that cost over 3 man-hours of lost time to support a decision about a tool intended to save time.
Microsoft Introduces a New Verification Mechanism for Teams Meetings
So, it was extremely welcome news to see Microsoft announce verification checks for joining Teams meetings. Initially announced just after Ignite 2024 with CAPTCHA support, the program will now be extended to include an email one-time passcode (OTP) verification option.
If your organization allows anonymous users to bypass the lobby, consider enabling the -CaptchaVerificationForMeetingJoin parameter in your CsTeamsMeetingPolicy configuration, and look for the new email OTP settings to arrive early in 2025. This should effectively eliminate anonymous bots and the risks to which they expose your meetings.
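For administrators, a sketch of what this looks like in practice, assuming the MicrosoftTeams PowerShell module is installed and you have a connected session; check the cmdlet's current documentation before applying, as the accepted parameter values may differ in your tenant:

```powershell
# Requires the MicrosoftTeams PowerShell module and an authenticated admin session
Connect-MicrosoftTeams

# Require CAPTCHA verification for anonymous and untrusted joiners
# on the Global (org-wide default) meeting policy
Set-CsTeamsMeetingPolicy -Identity Global `
    -CaptchaVerificationForMeetingJoin AnonymousUsersAndUntrustedOrganizations

# Confirm the setting took effect
Get-CsTeamsMeetingPolicy -Identity Global |
    Select-Object Identity, CaptchaVerificationForMeetingJoin
```

As with any CsTeamsMeetingPolicy change, the setting can be applied to a custom policy and assigned to specific users rather than the Global policy if you want to pilot it first.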
This move by Microsoft is a significant step toward addressing the security and privacy concerns of inviting AI bots into virtual spaces. Verification checks help ensure that only authorized participants can join meetings, safeguarding sensitive information from potential breaches.
CAPTCHA, OTP, and email verification each add a layer of authentication that makes it much harder for uninvited bots to slip through the cracks, which matters in an era where data breaches and privacy violations carry severe legal and financial consequences. Beyond blocking unauthorized access, these checks also deter malicious actors probing meeting platforms for vulnerabilities, helping keep discussions and shared information confidential.
Microsoft's initiative is a laudable example of leveraging technology to build trust in virtual interactions, and a reminder that as we embrace the benefits of AI, we must also be mindful of the responsibilities that come with its use.
Security and collaboration have never been more important. Schedule a Microsoft Teams Bootcamp with our certified consultants to give your end users and Teams administrators the training and guidance they need for effective collaboration and governance in Microsoft Teams.