Automating Gong Sales Call Analysis with n8n and AI [n8n At Scale]

n8n · 13.03.2025 · 5,972 views · 113 likes


Video description
🚀 Want to unlock hidden sales insights from your calls? In this video, I'll walk you through how to use CallForge, an AI-powered automation workflow built with n8n, to extract key insights from Gong transcripts and seamlessly integrate them into your sales, marketing, and product workflows.

🔹 What You'll Learn in This Video:
✅ Step 1: How to automate Gong call transcription and extract insights at scale.
✅ Step 2: How to aggregate competitor mentions, feature requests, and objections across all sales calls.
✅ Step 3: How to automatically route insights to Salesforce, Notion, or any tool your teams use.
✅ Bonus: How this automation empowers sales, marketing, and product teams with real-time data for smarter business decisions.

🔹 Why This Matters:
Enterprise sales teams have thousands of conversations, but the most valuable insights get lost in the noise. CallForge bridges that gap by surfacing patterns, tracking trends, and delivering actionable intelligence from every call, so your team can close deals faster and build products your customers actually want.

🎯 Ideal for:
✔ Sales teams that want data-driven decision-making
✔ Product managers tracking feature requests & integrations
✔ Marketing teams analyzing competitor trends & objections
✔ n8n users looking for powerful AI-driven automation workflows

🔹 Resources & Code:
📌 Get the CallForge Workflow Templates Below:
https://n8n.io/workflows/3031-callforge-01-filter-gong-calls-synced-to-salesforce-by-opportunity-stage/
https://n8n.io/workflows/3032-callforge-02-prep-gong-calls-with-sheets-and-notion-for-ai-summarization/
https://n8n.io/workflows/3033-callforge-03-gong-transcript-processor-and-salesforce-enricher/
https://n8n.io/workflows/3034-callforge-04-ai-workflow-for-gongio-sales-calls/
https://n8n.io/workflows/3035-callforge-05-gongio-call-analysis-with-azure-ai-and-crm-sync/
https://n8n.io/workflows/3036-callforge-06-automate-sales-insights-with-gongio-notion-and-ai/
https://n8n.io/workflows/3037-callforge-07-ai-marketing-data-processing-with-gong-and-notion/
https://n8n.io/workflows/3039-callforge-08-ai-product-insights-from-sales-calls-with-notion/
📌 n8n Documentation: https://docs.n8n.io/

Want to connect? Find me on social media and reach out to me directly:
- LinkedIn: https://www.linkedin.com/in/angelgmenendez/
- X: https://x.com/djangelic

💬 Have questions? Drop them in the comments! Let's automate sales insights together. 🔥

Table of contents (5 segments)

Segment 1 (00:00 - 05:00)

Hi, welcome to another episode of n8n at Scale. My name is Angel, and today let's think about this: your sales team is meeting hundreds of prospects each week. Each conversation is packed with valuable insights: competitor mentions, feature requests, objections. But at enterprise scale, these insights get buried in thousands of calls. How do you find the signal in the noise? Sure, there are great tools like Gong that record and transcribe your calls, and AI can summarize each conversation. But here's the issue: AI is only looking at individual calls. The real gold mine is the patterns across hundreds of calls. What's everyone asking for? What objections keep coming up? What's the market really telling you? What are the patterns across all these calls that your enterprise can use to make smart business decisions?

Here at n8n we run into the same problem. We meet with enterprise customers every week, we use Gong, we log insights into Salesforce, but we realized something was missing: there was no easy way to extract aggregated insights from all of these calls, and that's a massive problem. Imagine if your sales team could instantly see the most requested product features across all of their calls, if your marketing team knew exactly which competitors kept coming up, if your product team had a running list of the most requested integrations, all without sifting through a mountain of notes.

That's why we built CallForge. CallForge is an AI-powered automation workflow that extracts insights from Gong transcripts, not just from individual calls but across all of your sales conversations. It then segments those insights for each team and automatically sends them where they belong: Salesforce for sales, Notion for product and marketing, or wherever your team works. Even better, those insights can be sent back into Salesforce, helping your team track trends over time and make data-driven decisions. All right, now that you know why CallForge exists, let's see how it works. I'm going to show you exactly how we use n8n to automate this entire process, from Gong transcripts to actionable insights across your entire company. Let's dive in.

When we talk about CallForge, what we're really talking about is a series of n8n automation workflows that work together to perform all these tasks. There are currently eight different workflows, as you can see in this list, that together perform the duties of CallForge. The reason for this is that we want to make the system as modular and scalable as possible, and there's a lot of work done behind the scenes to prepare the Gong transcript into a format that is usable by the LLM. So today we'll go through each of those workflows, some in more depth than others, with the end goal of better understanding how to deploy this in your own environment. Keep in mind that I've created a demo environment and randomized the information, both to make it easier to consume and for security reasons, so I won't be able to show it end to end, because the real calls contain sensitive information. I will, however, show each workflow with data to demonstrate how it's supposed to be processed. Let's dive right in.

I like to start with what I call the Gong call trigger demo. The Gong call trigger is how we get these phone calls into the system to kick off the workflow in the first place, and it's done via Salesforce. We run an hourly cron trigger to check our Salesforce object. If we open up the Salesforce node, there's a condition that looks into the past; here I've set four hours, but typically it runs at two, so it looks at the past two hours' worth of phone calls, and I have two levels of de-duplication. The goal is to make sure that even though the workflow runs hourly, we're pulling two hours' worth of phone calls. This relies on the Gong-Salesforce integration; you don't have to do it this way, since you can trigger directly through Gong itself and skip this trigger workflow completely. However, this approach lets us filter and make sure the phone calls entering the automation meet certain criteria in Salesforce before being processed. With the integration, once a phone call is completed it synchronizes to Salesforce as a custom object, that object is stored within Salesforce, and we look for any of those objects created in the last two hours.
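The trigger logic described above (hourly run, two-hour lookback for overlap, stage and opportunity checks) can be sketched as a small function, the kind of thing that might live in an n8n Code node. This is a minimal sketch, not the actual workflow: the field names `CreatedDate`, `StageName`, and `PrimaryOpportunity__c` are assumptions standing in for whatever the custom Gong object exposes.

```javascript
// Hypothetical sketch of the Gong call trigger's filtering step.
// Field names (CreatedDate, StageName, PrimaryOpportunity__c) are
// assumptions, not the real custom-object schema.
const TWO_HOURS_MS = 2 * 60 * 60 * 1000;

function filterRecentCalls(records, now = Date.now()) {
  return records
    // Lookback window: two hours, even though the cron fires hourly,
    // so a failed run doesn't drop calls (dedup happens downstream).
    .filter(r => now - new Date(r.CreatedDate).getTime() <= TWO_HOURS_MS)
    // Opportunity stage must match the criteria from the video.
    .filter(r => ['Discovery', 'Meeting Booked'].includes(r.StageName))
    // Primary opportunity must not be empty.
    .filter(r => r.PrimaryOpportunity__c)
    // Sort by date, oldest first.
    .sort((a, b) => new Date(a.CreatedDate) - new Date(b.CreatedDate));
}
```

Because the window is wider than the cron interval, the same record can appear in two consecutive runs; that is exactly why the workflow keeps its two levels of de-duplication downstream.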

Segment 2 (05:00 - 10:00)

We then sort by date and check whether the opportunity stage meets our criteria, so it's equal to Discovery or Meeting Booked, and then we check that the primary opportunity is not empty; we want to make sure there's an opportunity in Salesforce before we process the call. Let's run it up to here. As you can see, the goal is to format everything into this object. This is a fake company, Zenith; again, we're randomizing this information for the sake of security, but the point is that if multiple objects come in, they are then sent into our call pre-processor.

Let's take a quick look at the pre-processor. This is where we start to bundle the Gong call information with information about our organization, things like the integrations we already have deployed in the product and the competitors that already exist. What I don't want is to pull that information for every single phone call, because each of those payloads is pretty big and can create a bottleneck. Instead, I generate it once in bulk, attach it to the JSON object of each phone call, and pass those into the LLM one at a time, making only one API call per hour for the integration and competitor databases, which, as you can imagine, can be pretty lengthy. This is useful because the transcripts will contain typos and mispronunciations, and it allows us to prompt the LLM: if you see something similar to the name of one of these integrations or competitors, assume there's a typo and go with the information we're providing here instead of the information in the transcript. This lets us error-correct on the fly by providing known-good data to the LLM. With these aggregates, we're bundling everything into one JSON object with different sections, which lets us segregate the information within each call.

Let's run it up to here; I don't want it to run fully. We pull the integrations, get all the competitors, and check the Notion database to see whether the call already exists. If we compare the data set and it already exists, we ignore it and don't continue, because we've already processed it. This lets me run the workflow in a modular fashion: if there's an error, I can come back to this level of the workflow and re-run from here, or from the next level, and it will not duplicate data, which is my hope here.

Once we get to this loop, we pass everything into our transcript processor. At this point, here's what the data looks like: the call data, the integration data, which we've reduced to a long string of integrations, and the competitor data. In the future we'll pull the competitors from Salesforce; for now it's a small list we were processing manually.

The transcript processor is a little more complex, but the goal is to enrich the data and set up the conversation with a breakdown of who the internal speaker is and who the external speaker is. This lets us target our insights only at the external speakers: things a sales rep might say can be ignored as contextual information, since it's not the customer-specific information we're trying to glean. These two code nodes do the hard work. I'm using the HTTP Request node because this was built before the Gong transcript node existed (which we do have now), but this gets the transcript in the format we need. If we pull it open, and again the names have been changed for privacy, what we have is internal, external, and even unknown for somebody who joins but isn't identified; we use external and unknown as part of our prompt. The goal is to generate one long conversation string that lets us get what we're looking for. Once it's been prepared, we go back up a level.
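The speaker-labeling step described above can be sketched as a small transform: map each speaker to INTERNAL, EXTERNAL, or UNKNOWN and flatten the turns into one conversation string for the LLM. This is a hypothetical sketch; the input shape (`speaker.email`, `text`) is an assumption, since Gong's real transcript payload uses its own structure.

```javascript
// Hypothetical sketch of the transcript processor's formatting step.
// The turn shape ({ speaker: { email }, text }) is an assumption.
function buildConversation(transcript, internalEmails) {
  const label = s =>
    s.email == null ? 'UNKNOWN'            // joined but unidentified
    : internalEmails.includes(s.email) ? 'INTERNAL'  // our sales rep
    : 'EXTERNAL';                          // the customer
  return transcript
    .map(turn => `${label(turn.speaker)}: ${turn.text}`)
    .join('\n');
}
```

Labeling every turn this way is what lets the downstream prompt tell the model to draw insights only from EXTERNAL and UNKNOWN speakers and treat INTERNAL turns as context.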

Segment 3 (10:00 - 15:00)

We then loop through all these calls and go into our "all data" node. This is a no-operation node; it doesn't do anything per se, it's just a way for me to aggregate everything before we pass it into the sub-workflow where the work is actually done. Let's take a look at that one, the call processor. Here, again, we check whether any data is duplicated, and if it is, we get rid of it. Once that's done, we post a message to Slack saying "hey, I've started processing X number of calls." As the calls are processed, we edit that message in real time, updating the index like a progress bar: "call 1 of 20 processed," "call 2 of 20," "call 3 of 20," until it's completely done and it posts the final "calls complete" message.

The other thing we do is create a parent object to store these in the database. We have multiple databases the information is output to: one for the calls themselves and their summaries, one for customer use cases of the product, one for integrations, one for competitors. All these databases need to link back somewhere, so this step creates a parent object that becomes the home that all those other database objects link back to. We need to do that before we pass the call into the AI, so that once the data is created it has a home to go back to. We then merge the conversation data with the Notion data so both exist together, and pass these into the AI processor, which is why we're here today. If we run it up to here, we'll see it running in real time: it's creating the Notion object, which we'll take a look at in a moment.

And here is the AI team processor, where the magic happens on the AI side. Let's run it; if I'm not mistaken we've turned off the connection because the Notion object isn't already created here, so let's just run it and see if it works. We'll give it a second... great, that worked, and it passes the results back into Salesforce.

Now, one thing to cover: why are we using Azure instead of, say, Ollama or ChatGPT? We're using Azure because it lets us find a balance between security and running on-prem. On-prem we'd need to run our own hardware and make sure the agents are up to date and everything is running correctly, which takes time and effort, whereas using OpenAI directly creates problems of its own: we're not guaranteed anything in terms of data security, which could be a problem if our information is leaked or stolen. By using Azure we find a good middle ground: we get the insights without leaking them somewhere else. Some of the benefits are security and network isolation (virtual networks, private endpoints, encryption, role-based access controls), compliance (SOC 2, HIPAA, FedRAMP, ISO 27001, GDPR), and data privacy: the data stays private and encrypted within Microsoft's Azure platform, versus OpenAI's, where you don't have those same guarantees. It also works well with enterprise solutions, so it's definitely worth it.

Let's take a quick look at the prompts themselves. I do have this pre-prompt here, but the user prompt sits in a central location and is generated and passed to all three AI calls, because it stays the same. Here's what it says: "Analyze the following call transcript between n8n sales representatives, denoted as internal, and external attendees, denoted as external or unknown. Provide the following details in a structured JSON format." And this part is key: "Please note that the company n8n is sometimes incorrectly called naden, naan, naton, Nan, nnnn, or Nathan in the transcript, so keep this in mind when reading the transcript." This allows us to keep any future typo corrections in one place.
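The centralized user prompt described above can be sketched as a template function: built once per call, reused by all three department-specific AI nodes. This is a hedged sketch; the `buildUserPrompt` name, the field names on `call`, and the exact wording are assumptions modeled on the prompt shown on screen.

```javascript
// Hypothetical sketch of the shared user-prompt assembly. Field names
// (domain, title, competitors, integrations, conversation) are assumptions.
function buildUserPrompt(call) {
  return [
    'Analyze the following call transcript between n8n sales representatives',
    '(denoted as INTERNAL) and external attendees (denoted as EXTERNAL or UNKNOWN).',
    'Provide the following details in a structured JSON format.',
    // Typo-correction hint kept in one place, as described in the video:
    'Note: the company n8n is sometimes mis-transcribed as naden, naan, naton,',
    'Nan, nnnn, or Nathan; treat all of these as n8n.',
    `Company domain: ${call.domain}`,
    `Call title: ${call.title}`,
    // Known-good lists let the model correct typos toward real names:
    `Known competitors: ${call.competitors.join(', ')}`,
    `Current integrations: ${call.integrations.join(', ')}`,
    '--- TRANSCRIPT ---',
    call.conversation,
  ].join('\n');
}
```

Because every department node receives this same string, fixing a mis-transcription pattern means editing one template instead of three prompts.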

Segment 4 (15:00 - 20:00)

All I have to do is update the central prompt and it automatically updates all three user prompts, which is very handy, since you don't want to be constantly fixing that in three places. We also give it the company domain, the call title, and the internal and external attendee names; then, due to potential errors in transcripts, a list of our competitors and current integrations (and as you can see, that's a big one); and finally the actual call itself. Again, this is a randomly generated transcript, not a real one, but it's formatted the way real calls are normally processed.

The key is that this user prompt, once generated, is sent into each of the three AI nodes along with a system prompt that is specific to each department. Under the system message: "You are an AI assistant specializing in analyzing sales call transcripts. Your task is to extract structured information about the call, including use cases, objections, summaries, and other relevant insights for the sales and marketing teams. Pay close attention to action-oriented language and specific requests made by the external participants. You have no tools; do not attempt to use an AI tool." That last line addresses an issue specific to Azure: since the prompt contains the word "tool," the model sometimes tries to look for one, and this instruction tends to fix that.

All of these sections tie into the output parser. In the output parser I've created a JSON example showing what I'm looking for, and the model outputs data in that shape. For example: "An organization is aiming to understand how to efficiently scale their workflows. The external team expressed concern about the pricing structure in relation to scaling usage. During the call, the team from Zenith ran various technical evaluations." We're putting these into a structured output that we can send into the next step, the data processor, to get them into our dashboards and databases in a consistent way. That consistency is what lets us create visualizations, charts, and graphs that help grow the organization and give smarter business insights. So we take the prompts, structure the output, give the information a format that makes sense, and pass it on to be processed.

To give you an idea, here's what that looks like. As part of the initial bundle of data we already have the Notion ID of the parent object that was created, so it's passed in here. The data goes into this workflow and is then passed into the other databases; for example, we have an objection data-processing section that handles the objection summaries, and it links back to the parent object to ensure everything works together. It's similar across all three outputs: all three are built the same way.

Here's the one we just created. Since it wasn't linked we won't be able to see this one specifically, but here's some randomized data to give you an idea. Opening up the object, we can look at the call summary, customer pain points, next steps, and objections, so we can quickly see which objections are coming up, plus marketing insights for our marketing department, actionable items, company size, and recurring topics, so we can look for patterns. That sits alongside the Salesforce information, so we can look things up if needed, or even the Gong call, if we as an organization need to dive into the recording.

But the real power is in the graphing. We can look at things like calls over time by sentiment and see whether calls were negative, neutral, or positive (all of these happen to be positive). We can also see who participated: calls over time by AE, so we can see which sales reps are doing how many calls per week or per month and which team members are processing more data than others, which is nice to have. Additionally, we can see things like AI versus non-AI calls.
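The structured output the parser enforces can be illustrated with an example object plus a tiny validity check. To be clear, this is a hypothetical shape: the field names (`summary`, `sentiment`, `useCases`, `objections`, `competitorMentions`, `integrationRequests`) are assumptions modeled on the insights shown in the demo, not the actual parser schema, and the validator is a stand-in for what n8n's structured output parser does from a JSON example.

```javascript
// Hypothetical example of the structured output; field names are assumptions.
const exampleOutput = {
  summary: 'Zenith evaluated workflow scaling; pricing was the main concern.',
  sentiment: 'positive', // positive | neutral | negative
  useCases: ['Scaling internal workflows'],
  objections: ['Pricing structure relative to usage growth'],
  competitorMentions: ['Make', 'Zapier'],
  integrationRequests: ['Slack', 'Salesforce'],
};

// A minimal validity check in the spirit of the output parser: every
// downstream dashboard assumes these fields exist with these types.
function validateOutput(o) {
  const arrayKeys = ['useCases', 'objections', 'competitorMentions', 'integrationRequests'];
  return typeof o.summary === 'string'
    && ['positive', 'neutral', 'negative'].includes(o.sentiment)
    && arrayKeys.every(k => Array.isArray(o[k]));
}
```

Enforcing one shape per call is what makes the downstream charts (sentiment over time, competitor counts, integration requests) possible: aggregation only works when every record has the same fields.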

Segment 5 (20:00 - 24:00)

As you can see, AI is huge right now, so a lot of the calls are related to AI in some way. We can also look at the integrations database to see which integrations we already have that are popular. In the enterprise world, Slack is one of the more popular ones, followed by Salesforce and Postgres; these are ones we've already integrated into the product, so the marketing team, for example, can start working on blog posts or content specific to those tools. The same view gives us an idea of integrations we don't have that we might want to focus on in the future: ElevenLabs, 1Password, Discord. Since this is randomized data, this particular chart is inaccurate, but it shows you how we can see which integrations to prioritize in a way that's visual and easy to process.

Same with competitors: if we look at the competitor chart, Make is our biggest competitor, followed by Zapier and Langflow, with all these additional competitors below, again giving us the ability to see this information at a glance. Now, we're using Notion to store this information, but you can use any platform you'd like for these databases; the point is that as long as team members are able to view it and turn it into actionable steps, that's where the key and the value lie.

Product feedback is something our product team uses quite a lot. We have negative feedback sorted here, product sentiment (positive versus negative), and positive feedback they can browse. Again, this is randomized data generated just for this demo, so it's not accurate in terms of what we actually see, but it shows you how it's formatted and how it can be used. Marketing insights are very useful for us, as are recurring topics, the words and themes we're consistently seeing across calls, and, last but not least, actionable insights, the things we want to work on. The one I actually wanted to show is use cases: the use cases view lets us build a gallery of what different companies are using our product for, generated already in a censored form, so it can be used to help other customers get ideas of where they can grow and how they can make their own products better.

So that's the workflows in a nutshell. Each of these works hand in hand. If you're trying to deploy this internally, you will need to deploy all of them, first making sure each one works separately and then connecting them back together. I hope this has been helpful and that you're able to deploy it; if you have any questions, feel free to post them in the comments section.

All right, let's take a step back and recap what we just walked through. We started with a simple but big question: how do you extract meaningful insights from thousands of sales calls? We explored how CallForge automates this process, using n8n to pull transcripts from Gong, process them with AI, and distribute the insights to your sales, marketing, and product teams, all without having to sift through endless notes. And we didn't just automate the obvious: we structured the AI prompts to extract actionable data, not just summaries.

Additionally, I want to clarify one thing from the AI prompt section. You'll notice we had two inputs: a user prompt and a system prompt. The system prompt defines the AI's role, what kind of assistant it's supposed to be; it's also where you create the structure of your output and give examples of what that structured output should look like. The user prompt tells it exactly what it should be looking for. This distinction is crucial because it shapes how the AI interprets and prioritizes the data.

Now, if you're wondering how to actually deploy this in your own stack, good question. CallForge is designed to be modular and flexible, so you can integrate it with your existing tools. Want to bring in external secrets, manage different environments, or optimize the workflows in queue mode? Those are some of the enterprise capabilities we'll explore in future videos. So if you found this useful, make sure to like and subscribe, because in future videos we'll be diving into how to fine-tune your AI prompts, deploy workflows, and maximize your enterprise integrations. If you're ready to deploy CallForge yourself, check the links in the description below; you'll have everything you need to get started there. I can't wait to see what you build. Drop your questions in the comments, and I'll catch you in the next one.
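The system-versus-user prompt split from the recap can be sketched as a generic chat payload: one shared user prompt, three department-specific system prompts. This is a hedged sketch of the pattern, not the exact Azure configuration; the `buildMessages` helper and the role wording are assumptions based on the prompts shown in the video.

```javascript
// Hypothetical sketch of the prompt split described in the recap.
// departmentRole is the per-team system-prompt fragment; userPrompt is
// the shared prompt built once per call.
function buildMessages(departmentRole, userPrompt) {
  return [
    {
      role: 'system', // defines the assistant's role and output structure
      content:
        'You are an AI assistant specializing in analyzing sales call transcripts. ' +
        departmentRole + ' ' +
        'Return structured JSON matching the given example. ' +
        'You have no tools; do not attempt to use an AI tool.',
    },
    {
      role: 'user', // tells the model exactly what to look for in this call
      content: userPrompt,
    },
  ];
}
```

The same two-message shape is then sent to each department's model call, with only the system fragment varying (sales/marketing focus, product focus, and so on).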
