There’s no doubt artificial intelligence is transforming the very concept of work and, when combined with effective collaboration, the possibilities are endless. In this six-part blog series, I use my nearly 20 years of industry expertise to dive into these possibilities, imagining a world where AI meets collaboration—and work is never quite the same.
I’ve discussed how AI could help us mine for project artifacts and build a project team using resources from across the organization. But with research complete and the team built, what does collaboration look like when AI is fully integrated?
My Collaboration Epiphany
Recently, I participated in a review meeting to discuss the messaging for some upcoming launches. In the conference room with me were four of my colleagues, and connected via video were another four spread across North America. One colleague led us through a presentation, and each slide generated conversation.
As the group debated tweaks to the messaging, I thought to myself, “Is this being recorded?” A few people were taking notes, but the discussion, facial expressions and vocal nuance weren’t being captured.
As the meeting went on, we talked about subject matter experts who weren’t present but had critical information. Several questions were asked about functionality, testing schedules and partner enablement plans. These were manually turned into action items for research and follow-up later.
After giving an off-the-cuff example of how to turn the bulky messaging on the screen into simple, plain speech, I realized that while we all agreed, no one had captured the good ideas our ears heard. There would be no way to reproduce what was said, so we had lost the “genius in the moment” forever.
We needed AI-powered collaboration assistance.
Context-Aware Collaboration Assistance
Let’s take this same scenario and see how AI could have improved it in real time.
The setup of the meeting is unchanged: still four participants in the room connected via video to four geographically dispersed colleagues. This time, however, the meeting is recorded and the AI's natural language understanding engine is scanning the conversation, transcribing it in real time. Using voice-marker (speaker identification) technology, the AI knows who is speaking and matches each speaker with their words.
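At its simplest, speaker attribution means mapping each transcribed segment back to an enrolled voiceprint. Here is a minimal sketch of that idea; the voiceprint registry, names and segment data are all hypothetical stand-ins, not any vendor's actual API.

```python
from dataclasses import dataclass

# Hypothetical voiceprint registry: enrolled voice signature -> speaker name.
VOICEPRINTS = {"vp-001": "Ana", "vp-002": "Raj"}

@dataclass
class Segment:
    voiceprint_id: str  # ID assigned by the (assumed) diarization step
    text: str           # transcribed words for this segment

def attribute_speakers(segments):
    """Match each transcribed segment to its speaker via voiceprint ID."""
    return [
        (VOICEPRINTS.get(seg.voiceprint_id, "Unknown speaker"), seg.text)
        for seg in segments
    ]

transcript = attribute_speakers([
    Segment("vp-001", "Let's simplify the launch messaging."),
    Segment("vp-002", "Agreed, slide three is too dense."),
])
```

In a real system the diarization and transcription would come from a speech API; the point here is only the final join between "who" and "what was said."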
As the presentation is discussed, the AI detects mentions of subject matter experts (SMEs) who aren't in the meeting. Connected to the corporate directory, calendars and HR systems, it displays action cards for those experts in a corner of the screen, each with a photo, presence information and an option to invite them to the meeting.
The AI understands these discussions may affect the work of the people mentioned, so it notes the reference in each SME's profile and adds them to a digest published to email or their profile for later review.
Information critical to the discussion that has been indexed and stored also appears as it is mentioned. Project timelines, functional requirements documents and other reports that exist within the organization are linked in real time. Some documents, while listed, require approval from content owners; if clicked, the owners are notified and must approve before the documents become available.
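The approval gate is the interesting part of that flow: a document can be listed immediately, but opening it triggers an owner notification instead of releasing the contents. A minimal sketch, with an assumed document index and owner addresses invented for illustration:

```python
# Hypothetical document index: title -> (owner, needs_approval). Assumed data.
DOC_INDEX = {
    "Project Timeline": ("pm@example.com", False),
    "Functional Requirements": ("arch@example.com", True),
}

notifications = []  # stand-in for an email/chat notification channel

def open_document(title):
    """Open a linked document, notifying the owner first if approval is needed."""
    owner, needs_approval = DOC_INDEX[title]
    if needs_approval:
        notifications.append(f"Approval request sent to {owner} for '{title}'")
        return None  # contents withheld until the owner approves
    return f"Contents of {title}"

gated = open_document("Functional Requirements")   # triggers a notification
public = open_document("Project Timeline")          # opens immediately
```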
If the right plans aren't available for the meeting, the AI searches for comparable examples elsewhere in the organization and proposes an action item the meeting owner can assign to a participant.
When the “genius in the moment” occurs, participants press a button to capture a 30-second clip of what was just said, and the transcription is annotated with a marker for future viewers that something significant happened. The archive is now flagged for future artifact searches as a key meeting in the project lifecycle.
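The "capture the last 30 seconds" button amounts to slicing a rolling transcript by timestamp and dropping a marker into the archive. A sketch under those assumptions, with toy transcript data:

```python
# Transcript entries as (timestamp_seconds, speaker, text) tuples (toy data).
transcript = [
    (0, "Ana", "Here's the current messaging."),
    (42, "Raj", "What if we just say it plainly, like this..."),
    (75, "Ana", "Let's move to the next slide."),
]

def capture_clip(transcript, pressed_at, window=30):
    """Return the last `window` seconds of transcript plus a highlight marker."""
    start = pressed_at - window
    clip = [entry for entry in transcript if start <= entry[0] <= pressed_at]
    marker = {"at": pressed_at, "label": "key moment"}  # flags the archive
    return clip, marker

clip, marker = capture_clip(transcript, pressed_at=60)
```

The marker is what makes the recording searchable later: future artifact searches can jump straight to flagged moments instead of replaying the whole meeting.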
At the meeting’s conclusion, the video and transcript are added as assets in the collaboration environment, and in any other integrated application the meeting is updated as a new content post. Important decisions are immediately distributed to any stakeholder groups as a potential project-affecting event.
Information Latency Reduction
Because there are so many meetings happening with different people, often simultaneously, when a decision is made it can take days or weeks for all relevant people to hear about it, validate it and adjust accordingly. Essentially, the reason the left hand doesn't know what the right hand is doing is that only the brain knows both hands exist.
To address this, AI could send a notification to other groups that may be impacted by decisions in the meeting, informing them of key information right away. With the AI processing changes related to projects across the organization, this whole-system approach can be managed and updated in real time, reducing response latency and keeping everyone on the same page.
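This whole-system distribution is essentially a publish/subscribe pattern: teams subscribe to the project topics that affect them, and a decision made in any meeting is pushed to every subscriber immediately. A minimal in-memory sketch (the hub, team names and topic are all illustrative assumptions):

```python
from collections import defaultdict

class DecisionHub:
    """Toy pub/sub hub: decisions published to a topic reach all subscribed teams."""

    def __init__(self):
        self.subscribers = defaultdict(list)  # topic -> teams
        self.inboxes = defaultdict(list)      # team -> received decisions

    def subscribe(self, team, topic):
        self.subscribers[topic].append(team)

    def publish(self, topic, decision):
        for team in self.subscribers[topic]:
            self.inboxes[team].append(decision)

hub = DecisionHub()
hub.subscribe("Partner Enablement", "launch-messaging")
hub.subscribe("QA", "launch-messaging")
hub.publish("launch-messaging", "Simplify slide 3 copy to plain speech")
```

The design choice is that the meeting doesn't need to know who cares about the decision; the subscriptions carry that knowledge, which is exactly how response latency drops from weeks to moments.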
This type of transformative technology is already being developed by Google for contact centers in partnership with Mitel. Watch the video below for a demonstration of Google’s Agent Assist functionality, showcasing how AI can help contact center agents solve problems faster with real-time insights.
Where To Go From Here
Now that we’ve mined the mountain of data to take advantage of knowledge from past projects, built an AI-assisted team and held a meeting with the full resources of known information at hand, it’s time to think about where AI needs to be to reduce the chance that information falls through the cracks. The answer: AI needs to be everywhere.