Build An AI System That Finds Viral Content Ideas (N8N)
1:36:47

Nick Saraev · 11.05.2025 · 17,460 views · 505 likes


Video description
Join Maker School & get your first automation customer + templates ⤵️ https://www.skool.com/makerschool/about?ref=e525fc95e7c346999dcec8e0e870e55d
Watch me build my $300K/mo business live with daily videos + strategy ⤵️ https://www.youtube.com/@nicksaraevdaily

Summary ⤵️ Built a free AI system that finds viral video ideas in any niche—just plug it in, set it up for yourself, or sell to YouTubers/content creators!

My software, tools, & deals (some give me kickbacks—thank you!)
🚀 Instantly: https://link.nicksaraev.com/instantly-short
📧 Anymailfinder: https://link.nicksaraev.com/amf-short
🤖 Apify: https://console.apify.com/sign-up (30% off with code NICK30)
🧑🏽💻 n8n: https://n8n.partnerlinks.io/h372ujv8cw80
📈 Rize: https://link.nicksaraev.com/rize-short (25% off with promo code NICK)

Follow me on other platforms 😈
📸 Instagram: https://www.instagram.com/nick_saraev
🕊️ Twitter/X: https://twitter.com/nicksaraev
🤙 Blog: https://nicksaraev.com

Why watch? If this is your first view—hi, I'm Nick! TLDR: I spent six years building automated businesses with Make.com (most notably 1SecondCopy, a content company that hit 7 figures). Today a lot of people talk about automation, but I've noticed that very few have practical, real-world success making money with it. So this channel is me chiming in and showing you what *real* systems that make *real* revenue look like. Hopefully I can help you improve your business, and in doing so, the rest of your life 🙏 Like, subscribe, and leave me a comment if you have a specific request! Thanks.

Chapters
00:00:00 Introduction
00:00:18 Demo
00:01:44 Live build
01:36:02 Outro

Table of contents (4 segments)

Introduction

Hey, in this video I'm going to be building a YouTube trend detector live. This is a system you can use for yourself if you're into personal branding, or you could sell directly to coaches, consultants, people selling info products themselves, and so on and so forth for over $2,000 a pop. I'm going to be building this live alongside you and show you all of the detours I take and mistakes I make along the way. Let's get into it. All right, so this is

Demo

future me doing a demo of the system. I've gone through a bunch of rigmarole in order to get this put together, and you guys are going to see all of that. In a nutshell, this is going to be two separate workflows: one to add or update new trending videos, and the other to take everything that you've added and updated and then send it in a nicely formatted email that I'm calling the daily digest. So, if I click test workflow, the first thing that's going to happen is it's going to pull from a Google Sheet database of channel IDs. It's then going to grab YouTube videos from the YouTube API before dumping those into the Google Sheet. And then what we're going to end up having is just a list of new videos here alongside view counts. Now, I'm doing this for a couple of channels, but essentially after we're done with this, this YouTube trend detector can then turn on. And when this happens, what we're doing is we're subsequently reading through this on a schedule, maybe once every couple of days or something. If I go to my Gmail, you'll then see that we now have a list of high-quality videos over certain multiples that are organized really nicely for us. And you know, we put in the channels that we want to track ahead of time and so on and so forth. But yeah, this is more or less a simple and easy way to do things. I'm going to run you through exactly what the logic for this looks like, maybe if you wanted to extend it, I don't know, build a website doing this, recreate 1of10 or whatever. Okay, so let's do the live build. Okay, so let's start with the

Live build

live build. Here's the current road map, and what I'm thinking about is how to get started and then finish this. What I'm thinking is we're actually going to divide this into two separate flows. The first flow is going to be the add and update flow, where we're actually going to grab the data directly from YouTube. And then the second flow is going to be the daily digest flow, where we basically just send a summarized version of all of the trending content. In this case, I'll just use an email, but in reality you can think of this as being deliverable through more or less any means that you want. You could do a Slack update. You could do SMS. You could spin up a beautiful user interface, so you could have a website. And I'll run through each of these in kind, but I just wanted to mention that I got the initial idea from Leonardo Gregorio. He showed me a trend detector that he was using to identify AI and automation related content, to find trends that he could jump on. And he's taken a very sniper-rifle approach to all this stuff. The guy's grown from basically zero subs all the way up to 20K extraordinarily quickly, much quicker than I did when I started. So he developed this idea of a YouTube outlier detector based off multiples. And I believe he got his idea from this website here, 1of10, which basically does all this stuff in the background, and it's like a SaaS product. The idea is we're going to rebuild or recreate a lot of the same functionality of this app, and of Leonardo's, except instead of using SQL, which I think is what he used in his specific case, I'm just going to do it all inside of a Google Sheet, just cuz I think SQL is kind of scary and intimidating to a lot of beginners. And I just want everybody to have as simple and as easy and as straightforward a time as humanly possible with this stuff. I personally don't really think we need to use SQL for it.
So, with all that said, here is more or less what I'm thinking. For the add or update flow, we're going to start by getting all of the videos for a specific channel. So, basically, we're going to have to add a list of channels that we're monitoring. From there, we're going to grab the individual video data using the YouTube API. And that's just how the YouTube API works: you can grab all videos in one call, but then you don't get a lot of information about each video in that call, you just get a list of IDs. The second step here requires us to ping each individual video to grab data like views. What I'm going to do next is filter for all long-form videos. So, you know how on YouTube you can do shorts, or you can do longer videos like my style? Well, we kind of need to compare them apples to apples. So, I'm just going to filter out shorts. Unfortunately, there's no built-in way using the YouTube API to do this. So, I'm going to do a heuristic, sort of like a proxy for shorts, and I'll run you guys through what all that looks like later. Then, I'm going to check if it exists in the database, the database being the Google Sheet here; that's just a fancy term for it. If it doesn't exist, we're obviously going to add it. If it does exist, we're going to update the metrics and stuff like that with the new view count, because presumably views change. And then once we have our little database set up, like our Google Sheet, what I'm going to do is, you know, once a day or once an hour, or I guess just however often we want, we're going to send over some sort of digest. And a digest, again, can be anything. In my case, I'll just do a quick little email, just cuz I think that's the straightest-line path. So what's that going to look like? Well, because I'm using a Google Sheet, I'm going to store all this data in different sheets. So I'm going to grab all the sheets. Then I'll grab the videos in each sheet.
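The two-call pattern described above (one call to list video IDs, a second call to get per-video stats) can be sketched in plain JavaScript. The helper names and the API-key-based URL are my own illustration, since in the video these requests go through n8n's YouTube node with OAuth instead:

```javascript
// Sketch of the two-call pattern the YouTube Data API requires.
// Helper names here are hypothetical; in n8n the YouTube node
// makes these requests for you.

// Call 1 (e.g. search.list or playlistItems.list) returns only IDs
// and basic snippets, with no view counts. Pull the video IDs out:
function extractVideoIds(listResponse) {
  // search.list nests the ID as item.id.videoId; fall back to item.id
  return listResponse.items.map((item) => item.id.videoId ?? item.id);
}

// Call 2 (videos.list) is where statistics like viewCount live.
// Build the request URL from the IDs gathered above:
function buildVideosUrl(videoIds, apiKey) {
  const params = new URLSearchParams({
    part: "snippet,contentDetails,statistics",
    id: videoIds.join(","),
    key: apiKey,
  });
  return `https://www.googleapis.com/youtube/v3/videos?${params}`;
}
```

This is why the build needs two YouTube nodes back to back: the first call only identifies the videos, and the second fetches the statistics the outlier math depends on.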
And then for each video list, I'm going to calculate the average number of views. This is sort of how you determine the multiple, or how trending a piece of content is: you compare the view count of a specific video against the average view count of all of the videos. Then for each video, we're going to determine the multiple on that. And then if the multiple is over the threshold, we're going to include it in the email. So, I like this idea because if we combine these two systems, right, we have something on the left here that's automatically updating the metrics, and then something on the right here that automatically checks to see if a multiple is over some threshold. Presumably, these two things are going to make the system evolve and be dynamic. The videos that come in on day one aren't necessarily the videos that are going to come in on day two. And far from being a negative of the system, I think that's actually a positive, cuz sometimes videos get rediscovered later on. And I think that if you want to really assess the performance of a video, you can't just look at a static snapshot, like, you know, today or tomorrow. You actually have to look at it as it evolves over time. All right, so that is the whole idea here. Let's actually jump in and build this puppy. So, I got my little YouTube trend detector here. I was just doing a little bit of wireframing beforehand to make sure that the YouTube API worked and that logically I could actually hook up my credentials. But aside from that, this is going to be an entirely live build. So, I'm going to create a new Google Sheet here, and I'll run you through how to do all the connections and everything like that you need as well. Let's remove that. I'm just going to call this something like YouTube trend detector. Let's just say database. Okay. All right. That seems pretty solid to me. What I have to do is connect this database now.
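The multiple logic above boils down to a few lines. Here is a minimal sketch; the field names (`views`, `title`) and the default 2x threshold are my assumptions, not values from the video:

```javascript
// Score each video against the channel's average view count and keep
// only the outliers whose multiple clears the threshold.
function findOutliers(videos, threshold = 2) {
  if (videos.length === 0) return [];
  const avg = videos.reduce((sum, v) => sum + v.views, 0) / videos.length;
  return videos
    .map((v) => ({ ...v, multiple: v.views / avg })) // multiple = views / channel average
    .filter((v) => v.multiple >= threshold);
}
```

Re-running this on a schedule is what makes the digest dynamic: as view counts get updated by the add/update flow, a video's multiple can cross the threshold days after it was published.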
So what I'm thinking I'm going to do is, you know how I mentioned we're going to have a list of channels that we're monitoring? So the very first thing is I'm going to make a table called channels. Then over here I'll just have it say channel ID on YouTube. What you do in order to get all the data about a channel is you need their ID. And if you're unfamiliar with how that works, I'm going to go over to my channel here. You can grab the ID of a YouTube channel (sorry about that) just by going, I think, More, and then all the way down to the bottom: Share channel, Copy channel ID. Okay, most people now use little handle versions like I do, Nicks, as opposed to the channel ID, so you can't grab it through the URL for a lot of channels. But if you find yourself in that situation, you can get it from there. Okay, so what I'm going to do is just test all this stuff out on one channel because, you know, that's really all that matters to me to start. Then once I've tested it on one channel, I can worry about dealing with all the other channels. And I'm just going to brainstorm everything that I'm thinking about live, so that even when I do end up in a detour, in some sort of crappy hole, you guys will see how I do the debugging of this as well. So, the first thing I'm going to do is add a trigger where, when I click this test workflow button, it runs the flow. And that's pretty simple. Second, I'm going to use a Google Sheets node. And what I want is just some way to grab the data. So, I'm going to use the Get Rows in Sheet node. Here I have the ability to add my credentials. Now, if you haven't added credentials before, I'm going to show you how to do it for YouTube in a second. In Google Sheets, all you do is click OAuth2 and then click Sign in with Google. Okay, very straightforward, very simple. I've already done this, so I'm just going to close this out and then select the credential that I have, which I'm just calling YouTube.
The resource is going to be Sheet Within Document, operation Get Rows. And then the document that I want is going to be this one I just created, YouTube trend detector database. The sheet that I want, if you think about it, is channels, because I'm just going to select from this list of channels that I'm monitoring. And that's how we're going to build the flow. Okay, then I click test step. Okay, what do you know? I've actually now got the data from the Google Sheet into n8n. So, we are good. Next, I'm just going to pin the data. And the reason why I'm pinning the data (and I always recommend pinning Google Sheets steps, which turns it from green to purple) is that, instead of having to do the API call to the Google Sheets API again, what you can do by pinning it is just cache or persist the data directly in n8n. Which means that for all future runs of this, like if I want to test the workflow, it actually just automatically grabs that data and then runs it through. I don't actually have to physically make a request to Google. The reason why this is valuable is because these APIs tend to be very fragile. So, if you always test every 3 seconds, like I normally do (I'm very incremental with my testing for good reason, which I'll tell you about in a minute), sometimes the API gets overwhelmed and then you end up just having to wait like 5 minutes. Who the hell wants to wait 5 minutes, right? Okay, so next up, now that I have the channel ID, if you think about it, I kind of want to grab the videos. So, I'm going to go YouTube right here. And there are a lot of different functions: get a channel, get many channels, update a channel, upload channel banner, playlists here, playlist items. Okay. So, what I want is the Get Many Videos. Now, you'll see it says credential to connect with: YouTube account. So, I've already done this, but I'm going to pretend that I haven't.
I'm just going to set it up from scratch for you. Okay. So, when you click add connection, it'll say OAuth redirect URL. And then you'll grab the URL here, and it'll have this little callback thing. Don't worry too much about this; this is just the way that it opens the window up in n8n. What you need to fill in is this client ID and then this client secret section. If you don't know how to do any of this stuff, just click open docs. n8n actually has pretty good docs on how to get up and running with service accounts and whatnot. I'll run you through what this actually looks like. What you have to do is go to console.cloud.google.com, just like this. And then what you have to do is make a project. Now, I've already made a project, so I'm at My First Project here. Okay. But what making a project does is you basically just give it a name. So, as you see, my website here is leftclick. I basically just gave it a name and now it says My First Project. What you have to do next is go to APIs and Services. Then what you have to do is go to YouTube. And when you go to YouTube, you'll find the YouTube Data API. In my case it's v3. Maybe you're watching this video in 2027 after the robots have won. So, maybe you are a robot, in which case, please spare me and my family. This might be a v-something-else by then. Okay. You're going to want to click, I forget what the verbiage is, but I think it's like, you know, add or enable or something. Once you're done with that, if you go to manage, then you'll go down to Credentials over here on the right-hand side. And then what you'll have is two sections: you'll have API Keys and OAuth 2.0 Client IDs. Now, I've actually already created my own credentials here quite a while ago for YouTube and whatnot. What you can do is go Create OAuth Client ID, and then here you actually create your own.
So what I'm going to do is, was it web application? There we go. And then the name will be whatever you want. Whatever I want. Okay. Under authorized JavaScript origins and authorized redirect URIs, we're going to go back to here, back to the YouTube or other Google specs. And then what we want is this OAuth single service. So now it's going to walk us through all these steps. Figure out the OAuth consent screen. Let me see. From your n8n credential, copy the OAuth redirect URL. Paste it into the authorized redirect URIs in Google console. Okay, great. So what that means is we go back here. You see how it says OAuth redirect URL? You've got to give that a copy and then go back over here to where it says authorized redirect URIs. We actually have to paste that in. Okay, once you're done, click create. Now you're going to have two things. So, I have a client ID up here at the top, which we're going to copy, and I also have a client secret. So, what I'm going to do is paste in the client ID here, then go over here and paste in the client secret as well. And you'll get this Sign in with Google box. Now, after you're done with that, this will open up a Google sign-in window. Then, click the email that's associated with the account that you just created. Go down to allow. All right. From here, it is now connected. You can close your window. And you've actually now done the connection. Remember that first step where you have to set up that cloud account? I think they give you like 300 bucks in free credits or something like that. You functionally will not run out of credits. I mean, you know, your free trial is over but your cloud platform journey doesn't have to be. I think you can continue doing your API calls below a certain limit or something like that. Anywho, from there, credential to connect with is YouTube account 2, resource video, operation get many, return all.
I'm just going to have the limit be like three videos for now. Under filters, we'll leave channel ID. And then what I want to do is just feed the channel ID directly in here. So this is just hooking me up to a specific channel, as we see. I'm just going to click test step and we're going to see what happens. Okay, awesome. And it looks like we've now received a bunch of data. That's pretty cool, right? So we've now verified that we can do a fair amount here. And if I just go back to my little road map here, we've now verified that we can actually get all the videos for a channel, which is great. Okay. All right. So now that we've gotten all the videos for the channel (well, not all of them, but three of them), what I want to do is pin this data again. So now I have access to these three items. And now what I'm going to do is go back to YouTube. And logically what I'm going to do next is get a video, just like this. Now, you'll already have the credential that we added. So this is the second one, the one that I just created. So I'll go there. And then you see where it says video, operation get. Well, now we need to feed in the specific video from the giant list of videos that we just got. And you'll find this, I think, here at least. Yeah, I'm pretty sure. That looks to me like a video ID, right? Okay. So now if I click test step, it's actually going to run on all three of these items. But to be honest, when I test APIs, I only really like to run them one at a time. So, I'm going to click on this button in between. I'll just type limit. And what this does is basically just limit it. So, if there were three items initially, now there's only one item. So, we're only grabbing the first in this case. You can also go last if you want. I just go first. Okay. So, now you see how on the purple it said three items up here and then over here it says one item? Well, basically that's what this limit node did.
It just took those three and converted it all into just one. It didn't merge them or anything. I guess what I'm trying to say is it just deleted the last two. So now that there's just one item as input, when I run this video ID, it should only run once. So I click test step. So I'm going to grab a specific video. And that's what it did; it just ran once for one item. Now, what's interesting, I'm going to go to the schema view here. It's probably easiest for you guys to understand. So we go back to schema view. What we see now is we're getting a ton more data about the specific video. Like on the left-hand side, do you see how there's no data about the specific video's views or anything that I could use to determine if it's an outlier? Well, on the right-hand side, we get that data. So, there's a bunch of thumbnail BS; I'm just going to close that. Tags, which is fine. Category, localized; this is the title and description. Content details. Okay. So, this over here is going to be important for us. PT32S. This is interesting. This is like a timestamp string basically showing how long the video is. In this case, this is 32 seconds. I don't know what P stands for. I think T stands for time, and then 32 is obviously the number of seconds. But this could also be like PT5M (5 minutes) or PT3H (3 hours). This is just the specific timestamp format that, for whatever reason, YouTube uses. I don't know why they don't just do the number of seconds. That would make everybody's life so much easier, but they use a five-character string for seconds, which is annoying. But the thing is, if you think about it logically, I don't want to grab shorts, right? So, I'm going to have to do a little bit of math here to unpack this. And I'll run you through what the math looks like later. But anyway, the statistics are what we want, right? See how it says 12,690 and this one says 382 likes? So you can run outlier detection in a number of ways.
Probably the simplest way is just views. But if you think about it, you could also run it on views and likes. You could run it on some multiplicative number here. Like, I don't know, maybe mathematically you think that one like is worth 10 views in terms of its viral power. So what you do is you take likes, multiply them by 10, and then add them to views. And that's what you use to score them, right? This is totally for free; this is up for grabs. You can do whatever the heck you think, based off of your knowledge, determines whether a thing is more viral than something else. In my case, you know, I just want to give you guys a simple nugget of a system, so I'm just going to use the view count. But anyway. So, yeah. Okay, we got a ton of data here. So, what I'm going to do is pin this output. And what I want to do now is actually filter out all the shorts, cuz I hate shorts. I think shorts are not representative of this stuff at all, right? Like, you could have two systems, one that operates off shorts and one off long form, but you can't compare them apples to apples. They're so different. There are so many discoverability issues and stuff like that. So, what does that mean? Basically, that means I have to filter the shorts. There's nothing in the YouTube API, which is really annoying, that says whether something's a short or not. It blows, but they don't just have a simple type: short. So, what you have to do is infer it based off of the logic. So, realistically, if something's less than 60 seconds, I'm going to call it a short. And if something is over 60 seconds, I'm going to call it a regular, you know, normal video for well-adjusted human beings. So, under content details, duration, PT32S: I need to somehow take this string and use it to determine whether or not it's a short.
Well, the way that this works, I know for a fact S means seconds, and if it's over a minute, it'll be like PT5M32S. So this string would mean the video is 5 minutes and 32 seconds. I think if it's like 3 hours, it would be PT3H5M32S: 3 hours, 5 minutes, and 32 seconds. What does that mean? I can actually just use the length of this thing. If this thing is five characters and the last letter is S, odds are this is a short, to be honest, cuz the second you get over 60 seconds it just changes to minutes, right? So I think that's the logic I'm going to use. I don't know if it's 100%, but we'll give it a try. So how do you actually do this? I'm just going to use the filter node and then feed in, scrolling all the way down here, the duration. I'm just going to go .length here. And I was like: if this is equal to five, and the last letter, so let's go to expression, if this ends with S, then I know that it's not a short. Uh, it is a short. Okay, so logically I'm actually looking for the inverse of this. Can I do the inverse of this? How do I do this? I guess I say is not equal to. So this has to be not equal to five. Sorry, I'm using string here, but I should be using number. There we go. So this has to not be equal to five. And then this last thing has to not end with S. Okay, that makes sense. So assuming that these two are true, odds are it's probably not a short. So I just ran it and it says kept zero, discarded one. So that means that basically this node returned nothing, right? Because there was one item here, and then it hit my beautiful, sexy filter, my anti-short terminator, and then, you know, there was nothing that wasn't a short to remain. So if I want to continue testing this flow, logically, what do I have to do? Well, I kind of have to fill it with real data, right? So I'm just going to change the limit node and then just cross my fingers and hope that I can return more than just a short. Let's go three this time.
I'm going to unpin it. I'm just going to go, where's the PT? Right. PT, get all videos, three items. Now we're returning three items. Now for each of these I'm just going to test three times. Bang. Bang. Okay. Now we've done three. And underneath this, duration: this one's PT32S. If I go to JSON, I should be able to get all of them, right? So, PT32S. Where is that? Okay. So this is that short. That's number one. This next one is 38 seconds. This last one's 59 seconds. Oh jeez. You know what I'm realizing? I think this actually sorts all of my videos by shortest to longest. So obviously the first three are going to be shorts. That's brutal. Can I, like, not do this? Hm. Is there some way to get all videos and then sort it? What do I order it by? Relevance. Date. You know, screw it. Let's just do date. This should fix it, because I haven't posted shorts in a while. Let's test this again. Okay. All right. Yes, this is relevant. He quit his job after Z. I just published this one. So cool. We can now pin this. So, just clicking the node, pressing P. Then I'm going to do limit. And now I know for a fact that the first item is going to be good. So I'm just going to change that to one. All right, I'm liking this. I'm going to go over here. I'm going to test this now. Should run once. Cool. And let's see the duration. You guys see where it's PT39M26S, right? So this should work. I'm not going to pin this. And now, when I run my wonderful filter test... Oh, hold on. It's still discarding it. Why is it discarding it? Hm, something to do with my math here. So, conditions: is not equal to five. So that's good. And it does not end with S. Oh. Oh yeah, obviously it ends with S. Duh. Oh, okay. Sorry. I think I can actually just get rid of this. My bad. My bad on the S stuff, guys. Yeah, don't do the S stuff, right? That makes sense. Cool. Well, I said I'd keep in the dumbass detours, so there's one.
Uh, yeah, obviously all of them are going to end with S, because that's how they do it. They go like PT39M26S, right? Well, actually, if you think about it, if something's like 4 seconds, it's probably going to be shorter than five characters, right? So, instead of is not equal to, what I should do is say it needs to be greater than five, because then it'll be at least 1 minute, right? That makes more sense. That's more logical. Okay. Yeah, that's way smarter. Cool. So, now that we have the video, the question is, what do we do next? So, I'm just going to go back to my thing. We now grab the individual video data. We've now filtered for long form. Now, we need to check if it exists in the database. Now, here's the thing: we don't actually have the database set up. So, it's kind of a chicken-or-the-egg thing, right? So, first of all, I'm going to change this to one. And then I'll call this add update. Then later on I'll add the other workflow, and that'll be like daily digest. So if I go back to my little database over here, my fledgling DB, what is it that we have to do logically? Well, I think what we should probably do is check to see... okay, so here's how we can do this. You know this channel ID? What we could do is make this the title in here. We'd store all of the data of the video, right? I don't know, even the multiple and stuff like that. But we need some way to add the channel ID automatically. So I'm just going to go to Sheets. I don't actually know if you can do this. Can you just get all the sheets or something? Hm, looks like you can create a sheet. So that's interesting. You can also get rows in a sheet, delete rows or columns in a sheet. Is there some way that I could check the sheets? Logically, that would make sense, right? Can I just get rows in sheet? Check this out for me. Yeah, I'm not seeing a way to, which kind of blows. Hm. I think what we'd have to do realistically... so you have to do it one of two ways.
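The string-length heuristic settled on above works because anything 60 seconds or under stays in the seconds-only PT…S form. A sturdier option is to parse the ISO 8601 duration outright. A sketch, where the 60-second cutoff is the video's own short heuristic and the regex is my assumption:

```javascript
// Convert an ISO 8601 duration like "PT39M26S" into total seconds.
function durationToSeconds(iso) {
  const m = iso.match(/^PT(?:(\d+)H)?(?:(\d+)M)?(?:(\d+)S)?$/);
  if (!m) return 0; // unexpected format: treat as zero-length
  const [, h = 0, min = 0, s = 0] = m;
  return Number(h) * 3600 + Number(min) * 60 + Number(s);
}

// Same cutoff as the filter node: 60 seconds or less counts as a short.
const isShort = (iso) => durationToSeconds(iso) <= 60;
```

Since duration is only a proxy for the shorts format, either approach can misclassify a very short regular video; for a trend digest, that's usually acceptable.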
One, we rebuild the whole database every single time we run this, which would be very computationally expensive. It would hit the API a lot. It would just not be smart. Or two, we need an initialization thing where, every time we initialize it, we feed it a list of channels, and it initializes the whole sheet for us, and then we do that any time we want to update it. Or, what we do is have a simple SOP where, anytime I want to duplicate this, I grab the sheet like this, okay? And what I do is just duplicate it and then change the ID to, you know, the new channel ID. That makes sense to me, because then the system would automatically just start dumping into the new one. Okay, cool. I think that's probably what we're going to do for simplicity. I want you guys to know there are a million and one different ways to do this, and I am a very hacky human being, so I prefer the hacky approach. Okay, so what am I going to do with the Google Sheet? Well, now that we've identified that it's not a short, obviously we need to add it, right? So, I'm going to go Sheets. I'll go append or update row in sheet. I'll go YouTube resource, operation. Okay. Document: from the list, I'm going to pick my YouTube trend detector database. The sheet I'm curious about: notice how I have the channel right here, right? This is where it's going to get a little trickier, and this is where I'm going to need to go back and update my logic, probably. I'm going to need to use the ID connected to the specific channel where I'm adding the video. Okay. The thing is, we don't currently have any columns yet. So what we have to do now is map all of this data, or all the data that we're actually interested in, that is. So what data am I actually interested in? Well, obviously I'm interested in a couple of things. So, the ID of the video, right? I'm going to add ID here. This is going to be probably like my unique identifier, right?
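That check-then-add-or-update step, keyed on the video's unique ID, is just an upsert. A plain-JavaScript sketch of what the Append or Update Row node ends up doing, with an assumed row shape:

```javascript
// Upsert a video row into an in-memory table keyed on its unique id:
// append if the id is new, otherwise merge in the fresh metrics.
function upsertVideo(rows, video) {
  const i = rows.findIndex((r) => r.id === video.id);
  if (i === -1) {
    rows.push(video); // not in the database yet: add it
  } else {
    rows[i] = { ...rows[i], ...video }; // already there: update views etc.
  }
  return rows;
}
```

Keying on the video ID rather than the title is what makes this safe, since titles and descriptions can change after publish but the ID never does.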
Like, this is the one thing on all videos that's always going to stay the same per video, because if you think about it, people can change their title. People can also change their description. This is the thing that I have to use as my unique key, and all databases need some sort of unique key. So I'm going to do the ID for sure. What do I want as well? Published at: that sounds like it's good to keep track of. Channel ID: I mean, I kind of have it here, but I don't know, I'll just add my channel ID. Maybe it's just going to be easier for me. Also, I'm realizing that I'm changing my conventions. There are two major naming conventions in programming. There's camelCase, which is where you go somethingLikeThat. And then there's, I think it's called snake_case, which is like_this. And I'm changing my conventions right now; I'm going from camelCase to the other thing. I think most things in n8n use snake_case, so I'm going to do that. That seems simpler. Okay. So next up, what do we need? We obviously need the title. That seems good. Description: I mean, I feel like the description is good to keep track of. Let's do the description. Screw it. Thumbnails. Hm. Looks like there are a few types of thumbnails: small, medium, high, and then standard. So why don't I just do it like this: small thumb, medium thumb, large thumb, standard thumb. This is just going to be the URLs; I don't really care about the heights or whatever. Channel title. Is that necessary? Yeah, I might as well, right? We'll go channel title. Okay. What else do I want? Tags. I could theoretically just dump all the tags, so I should probably do that. I'm so lazy. I'm like, do we need the tags? Yeah, we kind of need the tags. All right. Category ID. I don't know what the heck that is. I don't really think it's that valuable.
I'm sure you can imagine a world where category ID is valuable, but I don't really know what it means, so I'm going to leave it out. Okay. Content details: duration, that's going to be important, so we'll go duration, then dimension and definition. I could see you running some stats on that stuff; maybe in the future you notice that most of your high-multiple videos are HD or something. That might give you some data. And then ultimately the stats are what we actually care about. Good god, look how far off to the side this is. The way I like to organize my databases is to put the most important information first and stick all the less important information later, so to be honest, we're going to need to rearrange this. Are you going to care about the thumbnails for most of these? No, obviously not, so we're just going to dump those all the way to the right. What are we actually going to care about? The view count, like count, favorite count, and comment count. I don't even know what the favorite count is, and I don't know why nobody favorited that video. Granted, I did publish it yesterday, but can you all please favorite my video? I would love you a long time. Okay, so views: just go views, then likes, then favorites, then comments. Now, depending on what's normal, maybe we'll go title, views, likes, comments, favorites. And if you think about it, channel ID and channel title, we don't strictly need these either because they're kind of self-evident. They're good data to have just because they'll make my life easier when I build out the rest of the stuff, but they're not necessary. And then embed HTML. That sounds fun; let's grab embed_html too. Cool. So I'm pretty sure we now have everything we need. I'm just going to rearrange this by selecting everything and then double-clicking on the column header. So we have the ID, published_at, then the title.
Then the number of views, likes, comments, then favorites, then the description, tags, duration, definition. Cool, looks like a pretty good database to me. This is going to be our template DB from now on, so I just wanted to make it as perfect as possible. So now we actually have to add the row to the sheet. Actually, first we have to check if it already exists in the database, my bad. After that, though, we'll add it. So let's implement the adding functionality right now and do the existence check afterwards. We need to map the columns manually, so I'm just going to fetch them by refreshing this. It's not finding it. Why? I think I might have to refresh this or something, kind of annoying. And I think I've got to feed in the channel ID here. Okay, so the way append or update row works is that there's an ID column that incoming rows are matched on. That's actually great: we don't need the logic I just talked about, because it will automatically find old entries and update them using the ID column, which is incredible. Fantastic. Boy, is n8n fun sometimes. Now we have to do the annoying, laborious process of just mapping everything. So I'm going to map the title over here. Now I'll go down to the views, because I was a little presumptuous and wanted the views first. I wanted all my viral videos to be first, baby. Okay, favorites, cool. Now we'll go all the way up to the description. Tags are going to be interesting if you think about it logically: there's a bunch of tags here, so how do you actually get them all into one cell? You've got to use string logic. I'm just going to put the tags in, and you see how it's an array, right? So you just join the array with some sort of delimiter, and now we have all the tags in one field.
I like using a comma plus a space as the delimiter. I don't know why; I just think it works and looks better, and it plays nicer with more platforms. So I'm just going to join the tags like that. The duration, now, that was pretty interesting. Where's the duration again? Okay, right over here: PT39M26S. Hm. Okay, I'm just trying to think: what's the simplest way for us to parse this that works on arbitrary strings? Let's open up the expression editor. If I split this on the presence of an H, what do we have? The whole thing, right? If I split on the presence of an M, what do we get? Is it always going to start with PT? Let's just ask GPT-4o: PT39M26S, what is this format called? Let's see if we can get an answer from the lovely galaxy brain. ISO 8601 duration format. I want to parse this and turn it into the number of seconds, the simplest way. Let's see what it tells me. Parse ISO duration: it matches this PT-whatever pattern. Oh, I get it. It's actually extracting three pieces of data: the number of hours, the number of minutes, and the number of seconds. Could I just run that match inside of here? Let's see. So this is regex. No match. Okay. Yeah, now we've got a match over here. Can I just copy this? It would be sick if I could just copy this. Maybe I can. This is going to look like magic if you don't know what regex is, but it's actually pretty cool. Yeah, it just did that. How neat is that? Okay, so now if I want to get the duration: is it matching globally? I think it is. It looks like this array will always have four elements: the full matched string first, then the hours (null here), then the number of minutes, and then the seconds. If you think about it, I just multiply the number of minutes by 60 to get seconds, right? So 39 * 60, and then I just add the 26.
So I'm pretty sure what I have to do is add these up, and this is going to be a bit tough. What's the simplest way for me to do this that isn't super complicated? I mean, I could just use a code node, but I'm trying to stay away from code nodes because I want to keep this really simple. Okay, so with an array, you can index it with square brackets: zero selects the first element, one the second, two the third, and three the fourth. It's zero-indexed, right? There are only four elements, so if I put four, it's selecting a fifth that doesn't exist; we can't see it. But three, we can. So what does this mean? If we want the total duration in seconds, I basically grab the number of seconds first. Oh, it's not letting me add it because it's a string. I think we have to call toNumber() here, and toNumber() on the other one too, and now it parses as a number. Okay, cool. So next we take the minutes, multiply by 60, and add the seconds: that's the total number of seconds in the video, 2366. And if you think about it, you need to do the same thing for hours: hours * 60 * 60. And voila, we should have a relatively consistent way to always get the number of seconds in the video. Sanity check: let me go back here. PT39M26S: 39 * 60 + 26 is 2366, which looks right. Okay, so we did end up doing a little bit of code, and I'll be honest, it's not very pretty. This is probably one of the uglier expression fields I've made in my life, and it's not very maintainable either. But what it's doing logically is using a regular expression to parse out three fields, the number of hours, the number of minutes, and the number of seconds, and then adding up seconds plus 60 times the minutes plus 3600 times the hours. Okay. All right.
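The duration logic above can be sketched as a small JavaScript function. This is a hedged sketch: the regex and the handling of missing hour/minute/second groups are my assumptions, not the exact expression built in the video.

```javascript
// Sketch: parse an ISO 8601 duration like "PT39M26S" into total seconds.
// Assumption: each of H, M, S is optional, which the optional regex groups handle.
function isoDurationToSeconds(duration) {
  const match = duration.match(/PT(?:(\d+)H)?(?:(\d+)M)?(?:(\d+)S)?/);
  if (!match) return 0; // nothing recognizable, e.g. an empty or malformed string

  const hours = Number(match[1] || 0);   // match[1] is the hours group (may be undefined)
  const minutes = Number(match[2] || 0); // match[2] is the minutes group
  const seconds = Number(match[3] || 0); // match[3] is the seconds group

  return hours * 3600 + minutes * 60 + seconds;
}

console.log(isoDurationToSeconds("PT39M26S")); // 2366
console.log(isoDurationToSeconds("PT1H5M2S")); // 3902
```

Inside an n8n expression field you would inline the same arithmetic rather than define a named function, but the logic is identical.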
Now, everything else here should be pretty easy. We'll go definition, that seems good. For the small thumbnail, I'm just going to feed in the default URL, the medium URL here, the large URL here, and then the standard URL, which I guess is the biggest. Oops, I think I just deleted a field by accident. Channel ID was next, my bad, then channel title. Channel ID was right over here, channel title right over here, and all the way at the end, embed HTML was over here. Okay. Good god, that took forever. We should now have everything we need, right? I think so. So let us cross our fingers and add. I'll go over here and click test step. It's not adding... okay, it did end up adding. Very cool. I don't like how it bumped the column widths and made everything really big, though; that's going to make looking at my database a pain in the ass. So I'll select all and drag this column to approximate what I think a normal column width is. That's a little short. There you go. Now all future columns will look like that. Also, I'm going to rearrange this. I just double-tapped on it again and it's a little big. I don't like the description being that big, but I do think it's important that I can read it at a glance, so I'm just going to resize the description, and the tags as well. That looks fine to me. Duration looks good. Small thumb, large thumb: if I just copy this URL and paste it, am I going to get the thumbnail? I will. Wonderful. Got the embed frame as well; this is just something you can embed on your website, which is kind of neat, but I don't need the whole thing, so I'm going to make this a lot smaller. Okay. And now we have the database template. And if you think about it logically, we can just duplicate this sheet over and over again for every channel. So if I go back here, we've now done two things in one shot.
We've automatically checked whether the video exists in the database with the append-or-update logic: if it does exist, we don't add it, we update it, and if it doesn't, we append it. So we've actually finished this first system a lot faster than I thought we would. Very cool. All right, so what I want to do now that I've tested this on one item: basically, any time you're building any sort of n8n system, or really any automated system, test on one item first, and once you're done testing on one item, test on two-plus items. This is sort of your order of operations. One is simple, because it's very easy to get up and running with a single example like we did a moment ago. Now we need to test it on two examples. Odds are, if something works on two examples, it's going to work on n examples, where n is however many you feed it. If it works on two, it's probably going to work on eight; if it works on eight, it's probably going to work on 3,894. That's just a programming thing. Logically, when you go from one to two, you implement loop functionality, and that's the thing we have to verify. We figured it out for one, so why don't we now try an actual practical test on two? Notice how everything right now is pinned, though. So I'll go all the way back here and unpin; I just pressed P. And this row in the Google Sheet, should I delete it? Yeah, I'm just going to delete it. Then I'm going to save; always save. And where it says limit, I'll set the limit to two. Actually, I'm realizing I can just set the limit directly in the YouTube node; there's no need for a separate step, that was silly. So I'll do the limiting in the YouTube node. Two. Okay.
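The append-or-update behavior described above, where the Sheets node matches incoming rows on the ID column, can be illustrated in plain JavaScript. This is a sketch under my own assumptions about the row shape; in the real build the matching is done internally by n8n's Google Sheets node, not by custom code.

```javascript
// Sketch: upsert a video row keyed on its YouTube video id.
// Re-running the workflow refreshes stats instead of creating duplicates.
function upsertRow(rows, incoming) {
  const index = rows.findIndex((r) => r.id === incoming.id);
  if (index === -1) {
    rows.push(incoming); // new video: append a fresh row
  } else {
    rows[index] = { ...rows[index], ...incoming }; // known video: update in place
  }
  return rows;
}

let db = [];
db = upsertRow(db, { id: "abc123", title: "First video", views: 100 });
db = upsertRow(db, { id: "abc123", title: "First video", views: 250 }); // views refreshed
console.log(db.length);   // 1
console.log(db[0].views); // 250
```

This is why the unique-key column matters: without a stable `id`, every run would append duplicates.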
And I'm not just going to test the one step, I'm going to test the whole thing, because this is where the looping logic comes into play. Now it's appending, and it looks like we got two. Cool. So let's verify everything worked. What's the duration of this video? That's the one pretty complicated piece of logic I implemented, so let me double-check. It says 1,628. So 27 * 60 plus 08... yeah, that's actually it. Cool. I think we did it; we've verified that test. But there's one more test we need to do. We've tested that it works on one channel, so now let's test that it works on multiple channels. I'm going to copy the ID I have over here and do it for another channel, and if I run into issues, I'll figure them out. Whose channel? I want to do Leonardo Gregario, just because this guy's one of the nicest. Okay, there we go. So, copying this now, what am I going to do next? I'm just going to add it to my channels sheet. Paste it in. If you think about it, I have my SOP, right? For every channel, I paste it in, duplicate the template sheet, and paste the new ID in over here. And I'm just going to delete both of these test rows. Now when I rerun this, what should happen logically is: it grabs both of the rows from the channels sheet, gets all the videos for each person, gets each specific video by ID, filters out all Shorts, and then appends or updates the sheet, going newest to oldest. Okay, I think this is his ID. We're going to give it a go. So it's reading the sheets: 2, then 4, then 4. Mathematically, that looks good to me. Now it's updating all four. No, it didn't end up working.
And why? It looks like we just dumped all of Leo's videos into my channel's sheet. Huh, ain't that a metaphor for life? Okay, I think I know where this happened. The append-or-update row is probably hard-coding me. I'm feeding in json.snippet.channelId, and what I think I need is for this to change dynamically. So, right now we're feeding in this channel ID here... no, this should logically be working; this channel ID should change. All right, let me just check the JSON of the entries. Channel ID here was UCBO-whatever. Okay, cool, that's fine. This other one should be UCBO as well, because that's me. Now, for this one, it should not be mine; it should be UC8... yeah. So that looks good. Is it not finding mine? Maybe. Let me just copy this and paste it in. Is this the same thing? Yeah, this is the same as that. Not really sure where this issue is arising from. Hm. Well, there may be some built-in logic that prevents it from iterating. I've seen this happen before; I feel like this has actually happened to me where n8n does not have the ability to do this, believe it or not. The way I solved it before was by creating a bunch of new Google Sheets, one per channel. That's not going to work here at all, so I'm going to have to find a new solution, and I'll have to do it live. What I'm going to try uses an additional piece of n8n functionality. This is a bug, to be abundantly clear; this should not occur. Logically speaking, we're feeding new channel IDs into the sheet node, and that variable should not persist; it should reset at the beginning of every loop iteration. But for whatever reason, it's not. So we have to take a fundamentally different approach, using the Loop Over Items node. Okay.
I want you to know that stuff like this is going to happen. What's important is that you don't freak out or get super emotional about your system not working, because if you do, the likelihood that it will continue to not work is much higher than if you don't. So what I'm thinking we're going to do is loop over every individual item that comes in. The Loop Over Items (batches) node is basically a node with a loop route and a done route. The loop route is whatever you're planning on doing, and its output feeds back in. So for the number of items you feed in, in this case two, it'll run once, loop, run a second time, and loop again. What I want to see is whether adding a Loop Over Items node changes anything here materially. So I'm going to go back here and delete all four of these rows. I might need to update the logic, so I'm going to test this first, and then see if maybe there's some way to reset the data, because it seems logical to me that we should be able to. Okay, so we're still dumping everything into one sheet, which is unfortunate. Let's see why. At least now we can logically run through both of the items that are fed in. So this was run one, which has two items, which is for me; this is run two, which has two items for Leo. So the input should be the channel ID. Oh... oh, I'm actually hard-coding this now. My bad. We should be able to encode this dynamically, right? So maybe I screwed up here; maybe I wasn't using a variable. It's not over just yet; we might have already fixed this. Okay. Now, what I don't like doing is what I'm showing you guys here, constantly hitting the APIs over and over again.
So my recommendation is: don't get yourself into this position to begin with, because if you're constantly hitting these APIs, it's just a matter of time before one of them rate-limits you and says, "Hey, you know, you've done this way too many times." If you think about it, we're doing two API calls to this, plus another per call, so it's like four API calls total each run. Okay, I'm seeing a missing parameter here. Now, this is good; it means we're actually moving forward. Oh, what the hell is going on here? Do I have to remap all this? I don't really know what happened there. I think that when I changed this out for the snippet variable, it momentarily disconnected the node, and when it did, the mapped variables just disappeared. So that's not good; I wonder if that's going to happen every time. This took me a quite frankly stupid amount of time, so that's annoying, but I'm going to cut through this to save you all a little bit of time. Okay, I just ran this with a subset of the data so that I didn't have to remap it all if and when it inevitably broke, and it worked. So we actually got it: as you see, we have one channel's videos here and the other's here. So what I'll do now is just fill out the rest of the mapping. Pretty stoked about it, though. I knew there'd be some issue, and I'm glad we got to work through it live. Okay, this has taken a fair amount of time, so as opposed to trying to work it out logically, I'm just going to paste the code I'm currently working on to recreate that duration match directly into ChatGPT and say: I'm building this inside of n8n. TLDR: I'm processing an ISO 8601 duration string and trying to turn it into a number of seconds. Here's what I have above; debug why it isn't working. I also need it to work even if no elements are found. Let's try that. Hm, I see. Well, that's very good.
It's giving me a little snippet of code here. I don't know if it's correct, but it does look very good. Yeah, that is the number of seconds I was looking for, wrapped in a little function execution wrapper so it can be used directly inside of an n8n expression. It's not as clean as a code node, but it lets me avoid using one. So, okay, I think I'll leave it there. This seems somewhat robust; I don't know for sure. Sometimes AI code just blows, but this is enough for me to actually run the test, which is what I care about. Instead of worrying about whether it's perfect or complete, I'm just going to run the test and let the test tell me. And you know, in reality, your systems won't cover all edge cases; the idea is that they cover most edge cases. Okay, so let's test it. That's number one, my channel, and this should be Leo's channel now. Yes, looks good. I'm not seeing any tags on his videos. Why is that? Oh, wow, he must just not add tags. Oh, dude. You've got to add some tags. I'll make a note to text him: bro, you've got to add tags. I'm sure he's going to find that pretty funny. Let's close this up. Okay, cool. So we've now done that first section, and if you think about it, that's actually all we need to do, because we tested that it works on one, then we tested that it works on two, so we should now be able to do this on basically an infinite number, assuming we don't hit rate limits and stuff like that, which is a problem some people will have. Next up, we have to build the daily digest. What this daily digest is going to do is grab all the data inside that database of ours. So it's going to list all of the sheets, then for every sheet inside our database, it's going to go through and get us all the videos. And then, for every channel's list of videos, we want to calculate the average.
So, for every list of videos, like this list here, we want to calculate the average number of views, and then we're going to use that to determine the multiple of each new video. If the multiple is over some threshold, aka the threshold we define, which I think I'm just going to set at 2x: you know, in multiple detection there are a variety of ways to do this. You can use 2x, 5x, 10x, 2.5x; you can have it change with time; you can define it somewhere else. You can do anything you want, really. I'm probably going to do 2x because that seems simple to me. And then I'm going to include it in some sort of daily email digest. Sounds fun. Okay, so let's build out the logic for that. It's going to be another n8n workflow, so I'm going to go back here. The way I like to organize and title these is the same way I used to do it way back in the day on Make.com: I'd go YouTube Trend Detector, Add/Update, and this one will be YouTube Trend Detector, Daily Digest. All right. So these two are separate, right? And I'm running this manually right now, but if you think about it, realistically you should be running it on some sort of schedule, so we should add a schedule trigger instead (and you can add multiple triggers). So now it's technically scheduled, if I turn it on. The schedule I'm going to add: let's trigger at, I don't know, 6 a.m. or something like that, at minute zero. So basically, every morning at 6, assuming I turn this on, it's going to run, proceed through the rest of my flow, and get that first bit of work done, which is nice. And just because there's nothing on the done branch of the loop, I'm just going to connect this here.
This isn't the prettiest, but yeah, I think that looks okay. Notice how this would have been completely unnecessary if there were no bugs in n8n. There were bugs which prevented us from doing this directly, because some of the data persisted when it probably shouldn't have, so you don't normally need this Loop Over Items node. Maybe future versions will solve this automatically. Okay, so let's do the daily digest. First thing we need to do is grab all of the data in all the sheets, right? This is going to be kind of tough to do; I don't actually know yet how we're going to do it. What are we going to do, ladies and gentlemen? All right, first we're going to grab all of the rows in the channels sheet. So I'm going to go over here to channels and just grab all the channel IDs. That makes sense, right? Talking to myself here. Okay, we're going to grab this, I'm going to test this, and we get the channels. Very cool. I'm going to pin this, so now we have some pinned outputs here. Now, for every channel, what do we have to do? Well, we want to get all of the rows in that channel's sheet. So what am I going to do? I'll add my credential (and we may run into the same issue we just ran into, by the way), pick the YouTube trend detector database right over here, and for the sheet I'm going to select it by name, dragging the channel ID in as an expression. So what we should get logically is this: we ping this twice and return four items in total, because every sheet currently has two. One, two for me; one, two for Leo. So I'm going to test the step and we'll see if we get it. As always, I'm expecting a certain output, and I'm just rating what I expect to get against what I actually get. Okay.
So, the way I like to view the data is as JSON; it's just the easiest. So: "he quit his job," that's good; "fix 90%," that's good. Then "he quit his job," "fix 90%" again. Notice how this ran twice on the same ID. So this is the same problem we were running into before, and logically we're going to have to find a way to solve it. Let me think where we're going to add this. We're probably going to add the loop here, delete the replace-me placeholder, delete this little route, and I think we're probably going to have to loop over this. So first I'm just going to run this so I can grab the data from the Loop Over Items node. Okay, let's test the workflow and see what we get. We got a loop branch with one item with a channel ID. Very good. Now that we have this, I can map it individually, let it loop over this, and see if that fixes it. Running once, twice... okay, now we have two runs. The first run was me; the second run is Leo. Very cool. So we've actually gotten everything we need. How cool is that? All right, so where are we at now? We got all the sheets; we got the videos in each sheet. Now, for each list of videos, we need to calculate the average. How do we calculate the average? Well, n8n is returning a list, an array, and in this array we have views right over here. So we should very easily be able to compute the average mathematically with a little calculation. I don't know, is there like a calculate node? I don't think so. Maybe a Set node? So I'm going to add a few fields; I'll add views here. No, we can't actually do this. What we need to do first, sorry guys, is aggregate this, I think, because this is currently many items and we need to put it into one item. So, Aggregate. Hm.
It'd be really nice if we could just get the average directly. Okay, I think we might need a code node just to make it work. There are a million and one ways to do this; I think the code node is probably the easiest, because what we'll do is aggregate all of this stuff now. So, for (const item of $input.all()): $input.all() just gets all of the items being fed into the node. First of all, I'm going to pin this output right here. Second of all, I'll write const array = $input.all() and return array. I don't actually think this is going to work... okay, no, it did work. Once we have an array, we need to perform a mathematical function on said array. You know what? Why don't I just have AI do it, as per usual. It's funny: every time I'm like, I'm just going to code this myself, I realize I could do it myself or I could just have AI do it. So: each entry includes a views parameter. I want to get the average of each item's views, and then filter so that I only output items whose view counts are over 2x the average. Okay, we're going to generate new code here. You could also ask any other model; you don't have to ask this one specifically. I just wanted to see if I could do this easily. So logically, this is grabbing all of the input items, mapping them, grabbing each item and its JSON, and then getting the average views. It's using what's called a reduce function to get the average. This part is unnecessary, to be honest, but it does it anyway. This filtered-items part is very proper. Cool. This is the multiple I'm going to be using: two. And it looks pretty good to me. I'm going to return this now and see what happens. No output data returned. Well, that doesn't make any sense logically, right? It has to return some sort of output data.
The reason why is that, logically speaking, if you take the average of two items, the average has to be higher than one item and lower than the other, right? It kind of makes sense. So we should definitely have some data here. It might just be that I need to re-loop this. I'm going to test the whole workflow and see what happens if I dump it in like this. No, it looks like we're feeding in the items and then the code isn't really calculating, which blows. "Your current code doesn't work. Come up with a simpler way to determine the average of all of the items and match each against that average." Let's try this, and if that doesn't work, then I'm going to ask ChatGPT, consulting the guy that made all this stuff up. That's funny. Going to test this... I'm still not getting any output data, so I think there's just some silly issue here. I'm going to run this through ChatGPT. What did I ask it before? Okay, it's giving me an interesting idea; I just have to copy the code here. Okay, so we're now going to add a little bit of debugging logic. I'm going to open up the browser console, then test this one more time. It says the average views is 4,456. Oh no, it actually looks okay. Oh, you know what it is? I think we're running this just once. Let's see here: feeding in two items, both of mine, then calculating the average views. So, 3,999 for the first item, then 4,913, and the average of those is logically 4,456. Cool. But no, it's not returning the items that are above average. That's annoying; there must be some other problem here. I don't like it. Okay, so it looks like we're feeding in two raw items, all the view counts are here, total views, average views... okay, it's adding up the views, but it still seems to be doing something weird. I really don't like using the code blocks.
So I'm just going to cut that out and use a Filter node here instead. What I want to say is: views is greater than the average. Okay, I ended up solving this with a simple Filter node instead of a code block, and I fed in this expression, which probably seems pretty intimidating to you, but let me walk you through it. We grab the previous node, this Google Sheets node here. In n8n, you can get an individual item using the $json syntax, or you can grab all of the items by referencing the node's name explicitly and then calling .all(). Then I'm using a function called reduce. What reduce does here is, well, it's an unfortunately complicated way of just calculating the average. It'd be really cool if there were just an average function or something like that. Maybe there is; maybe I'm making this all way too complicated. I don't think so, though. No, I think you have to do this unfortunate bit of code. What it's doing is reducing: it takes the running sum and the current item, and uses this arrow function to add them up. So this gets the total number of views, and then I'm dividing by the total number of items here. For instance, if there are 10,000 total views across two videos, the average is 5,000, which is actually pretty close to what it is here. And then we feed that into the filter and say: hey, is the number of views of this individual element greater than the average number of views of all of them? Pretty straightforward, right? Pretty simple. If so, we return it, which is cool. And after I'm done returning this, if you think about it logically, what do we have to do? We want to accumulate all of these and then send them out in an email, right? So I don't just want to add the email here; I want to run through my whole thing first.
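The reduce-average plus threshold logic above can be sketched as a standalone JavaScript function. A hedged sketch: field names like `views` mirror the sheet columns, the 2x constant is the threshold mentioned earlier, and the real build expresses this inside an n8n Filter node rather than a named function.

```javascript
// Sketch: find "outlier" videos whose views exceed MULTIPLE times the channel average.
const MULTIPLE = 2; // threshold chosen in the video; could be 2.5, 5, 10, etc.

function findOutliers(videos) {
  // reduce sums the views; dividing by length gives the channel's average.
  const avg = videos.reduce((sum, v) => sum + v.views, 0) / videos.length;
  return videos
    .filter((v) => v.views > MULTIPLE * avg)
    .map((v) => ({ ...v, multiple: v.views / avg })); // keep the multiple for the digest
}

const videos = [
  { title: "a", views: 1000 },
  { title: "b", views: 1200 },
  { title: "c", views: 9000 }, // the outlier: well above 2x the ~3733 average
];
console.log(findOutliers(videos).map((v) => v.title)); // [ 'c' ]
```

In the n8n Filter node, the equivalent expression references the Sheets node by name and calls `.all()` to compute the same average inline.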
I'm going to add my email over here instead. But just for my own sanity, what I want to do now is test this filter out on all of the data. So I'll go test workflow. Grabbing the data from the Google Sheet, doing the filtering steps. Cool, cool. And it looks like on run one there was one item kept. That makes sense, because out of two items, obviously the one above the average will be kept. And on run two, one item was kept as well. These would be our outliers, for instance. And now we have those two items accessible to us in the done branch, as we see here. What we want to do now is make an email delivering these. Again, there are a million and one ways to do this sort of delivery; I'm just going to do it in Gmail. So what I'll do is say send a message. And we don't need to loop the done branch, just leave that over here. We do need to loop this, though. Okay, so I'm going to move this lower, right here. This right over here. Simplest way I've found of organizing this stuff. Oh, that's annoying that we can't do one more. Oh, n8n, why must you do this to me? Then we're going to add our credential. This is the same idea as before: you just click create new credential and sign in with Google. I've already created a bunch of credentials here, so I'm going to close this and just use my Gmail account number four. And hypothetically, I'm just going to send this over to my personal email. And then I'll say daily digest: trending YouTube videos, email type HTML over here. You don't have to do HTML. I'm just going to copy all this and feed it into AI. So, let me see if I can go from here all the way down to here. I'll say: above is a bunch of data on trending YouTube videos. Format this into a simple HTML email I can send. It's part of a daily digest. Oh, sorry. All I care about are the title, the channel, the thumbnail, the video duration, and the multiple. Right.
We should totally do the multiple. Okay, well, let's just keep this for now and then I'll add the multiple afterwards. The multiple is really cool to have. Now it's going to format this as an email, hopefully. I'm just getting network connection loss, not entirely sure why. Might just be my hotel internet. Okay. Okay, here we go. I don't know what this is. What is this supposed to be? This is hardcoded, right? I don't like how this is laid out. Okay, daily AI video digest. Cool. All right. So, missing title. We'll put the title here, channel title, duration over here. Okay, cool. We'll do the thumbnail here. Okay, hold on. This is just one; I meant to say include the variables, so that I can very quickly find and replace all the variables in a moment. Cool. Going to copy this now. Paste this in. Oh no, I don't want the each helper, man. Ridiculous. Yeah, we don't want each. That sucks. Can we actually do that? Maybe we can. Fascinating. That would be pretty cool if we could. I don't know if we can. I could do the logic here, but actually, let's just send one to start and then I'll worry about everything else later. Okay, we'll just send one to start. We'll go expression. Let's do this. Image source: we'll just use the large thumbnail here. For the title of the video, I just want to get something on the page. Basically, I just want an email sent so that I can very quickly and easily identify whether or not the template is trash. I don't actually care about all of this data too much. Duration: this kind of sucks, but seconds, I guess. And I'm just going to remove this each thing, so we should just pump out one of these now. Okay. And then I only want this to run... wait, this is running twice now. Why is this running twice? We need to aggregate these, that's my problem here. I'm just going to aggregate all item data into a single list.
It's unfortunate because we actually have to run everything here. So let's just do this: test workflow, aggregate. It should now be my output. Cool. Very cool. And now I'm probably going to have to remap this. Yes, I will. That kind of sucks, but that's how it works. Let's just do the first one. Okay, so we'll go source, and then I'll go large thumbnail. Then here, under title, data[0].title, channel title. Cool. And duration. Awesome. Now, if I test this, it should send an email. So I can go to my inbox and see how bad the formatting is. Usually the formatting is not the best. Okay, that looks pretty good. Couple of things I don't like here. I don't like the size of this. Can I make this smaller? Probably, just by changing the... yeah, we're using the large thumbnail here. I'm going to try using the small thumbnail for one. What else don't I like? I don't like the fact that it says this email was sent automatically with n8n. In n8n, you can change that: just go to append n8n attribution and turn that off. Looks pretty okay, honestly. "Fix 90% of your AI agency problems in 30 days." Okay, let's try it again. We go here. Oh, nope, that's pretty blurry. I don't like that. And yep, that's a little small. Can we change the image source class? Oh, you know what? That's the problem here. That's the problem. Let's just change it to 240 pixels: we'll still use the large thumbnail, but display it at 240 pixels. Okay. And then what else do we really want? I guess we just want all of them. So, logically, how do we do that? What we have to do is, for every item inside of aggregate, we just have to generate this block. Why don't I just paste this in again and see if two emails look okay? This looks fine now because it's at 240 pixels, right? Yeah, it's not the best to have them all stacked like this. Kind of wish we could go lengthwise.
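Conceptually, the Aggregate node collapses many incoming items into a single item holding one array, which is why the email template then references fields by index like data[0].title. A rough sketch of that behavior, with made-up sample data and assumed field names:

```javascript
// Two incoming items, shaped like n8n items (fields live under `json`);
// the titles and view counts are made-up sample data.
const items = [
  { json: { title: 'Video A', views: 3999 } },
  { json: { title: 'Video B', views: 4913 } },
];

// What "aggregate all item data into a single list" does: one output item
// whose `data` field is the array of every incoming item's fields
const aggregated = { json: { data: items.map((item) => item.json) } };

// Downstream, the email template picks values by index, e.g. data[0].title
console.log(aggregated.json.data[0].title); // 'Video A'
console.log(aggregated.json.data.length);   // 2
```

That index-based remapping is the "kind of sucks" step mentioned above: once items are aggregated, each template field has to point at a specific array position.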
If I feed this in here and then say: feed this into AI. Let's use GPT-4.5, and I'll say this is an HTML template that's supposed to return a nice-looking, minimalistic list of higher-performing YouTube videos. I put two as an example, but it should scale to infinitely many. Right now this looks poor. This looks bad because they're stacked on top of each other, and I don't like the formatting, etc. Fix this so it looks nice and clean and the videos are side by side in some sort of clean, minimalistic but sleek grid pattern. Okay, there you go. We're going to see how that performs. We'll keep the two, and after that I'll deal with the logic of generating multiple little video grid items. The thing is, emails just inherently lack the ability to do some of this cool formatting, which sucks. So, oh, sorry, I didn't mention this. Sorry, sorry. This is an HTML email, so it needs to be formatted in light of that, right? Emails are formatted a little differently than websites: it needs to use tables instead. And it doesn't look like it's a table. Okay, cool. Anyway, it's going through and it's now creating me this little digest, which is nice. Okay, let's expand this little code block now and hide the sidebar. And then what I want to do... oh, we can actually preview the output. No way. Got a little HTML preview in here. Doesn't look very good, not going to lie. Oh, don't tell me it's using each. Please don't use each. Damn, it's totally using each, isn't it? We just said no each. Yeah, it's doing each. That sucks. We just need to go to the TR now, I think. Okay, cool, that looks much better. So now we have basically this nice infinite layout: if I added more, we'd be able to do more. Now I just have to generate the code. 1.7. My bad. Where does it do the 1.9? Does it have a 1.9? That'd be much better. 140... what's the width here? It's probably going like 170, maybe 180 with 20 on both sides. Right. Okay, try that.
And then I'm also going to change this a little bit. So we'll say... cool. Looks fine. Nice. All right. Cool, man. Nice. I'm liking this. This array, though. Why is the array the same? That's a problem, man. This array should not be the same; it should have different items here. I mean, it looks and feels a lot better, for sure. Let's test this now. There's still a really weird cutoff here, man. That's not right. Thumb container, object-fit: cover, dimensions. What are my dimensions? Height and width. It's 100%. So where is this being applied? Can I do, like, 160? Or 170, 180, 160? I feel like it's 155. Okay, you know what, it's probably 155. So where is this? Go back here; instead of 155, send this now. That way it's not going to be super skinny again. Nice. Oh yeah, that's perfect. Got the whole thumbnail, baby. That's what I'm talking about. Okay, we can just set the channel and duration on the same line, can't we? Okay, so let's get a list of things we want to do now. Is there padding over here? I don't know why there's padding over here. Just remove the padding. Okay, so we've now verified that this works with two. I was just feeding in examples of the exact same code snippet, i.e. the same thumbnail and everything. But now I want to make it work with different thumbnails. So I'm just going to jump in and write a little snippet of code to handle this for me. First, I'm going to trust that this runs on the data I'm pinning. But I actually want it to run on live data, so I'm going to test this workflow and see what's going on. Okay, looks like we've now sent one email, and I believe it's going to be the same thing twice, right? Okay. No, we're actually getting data directly from the source, so that's fine. Looks like the sizing is a little bit off.
I don't really know entirely what's going on with that, to be honest, but yeah, these are actually the trends. So, honestly, it kind of already works. One big thing we don't currently have is the multiple. So that's one thing I have to do. How am I going to do the multiple? Well, I guess I could just put the multiple right next to the title, right? So I think that's what I'm going to do. I'm going to go into the HTML template here, and where I have the title, which is right over here, I'll add a span for the title, and then also the multiple. For the span, I'm just going to go style equals font-weight: bold. Then over here, I should be able to come up with a little multiple. Now, what is the multiple? Well, the multiple is going to depend on this. So I guess I can just copy all of this. This is not at all clean; I'm going to be doing the calculation directly inside the template, which most people do not recommend. Okay, but still. Let's think about this. What are we doing? I know what I'm going to do. If you think about it logically, what we need is for this filter to output an additional field. We need to drag all of these in. Can it just automatically map? No... okay, yes, it can. That makes sense. What we want is that average, right? So I'm going to go to the filter, and I'm going to calculate the average. Then here I'm also going to add a field called average and feed that in as an expression. So now we're going to get the average along with everything else. What this means is that when we feed back into the loop over items, it's going to have everything we need. I'm just going to undo this and test it for myself. It's going to include the average, which we can then use to work out the multiplier. We get everything we need plus the average. Wonderful. Now we connect this to the aggregator node.
Now I can use this average. Well, sorry, I guess I need to run this again. Kind of annoying, but it is what it is. We actually have to run all of this with the aggregator, my bad. So we're going to hit the APIs a bunch more times. Let's see if we get a rate limit issue. Nope. We get the average. Wonderful. We're going to pin this. Now we're going to connect this. And now that we have this, we can go through the HTML template, somewhere over here, and establish that average. So, let's do the math. Is what I want going to work? I don't know. Yeah, I don't think we have the ability to do each, so this doesn't really solve my problem. Oh... oh, it does. It does. Okay. Wait. I'm aggregating here, and I aggregate up there, right? Wait, what happens if I feed multiple items into this HTML node here? What happens? I don't know. Let's give it a try, man. Why not? Hmm, we're not getting the data anymore. Where are we getting the data from? JSON data[1].title, right? Oh yeah, we can just do this, right? Delete that. Cool. So, delete the TDs. No, just get rid of that too. Okay. So where's this video table? Am I doing a table per video, or what? I think we just deleted this. And then we delete all that. Delete that for sure. It says TR. We can probably just output a bunch of these, right? Let's go and do that again. Why is it doing the same thing? Because I'm returning the same thing. Well, this is what I wanted, so now I just concatenate them, right? Yeah. So I just go here to the video table, cut out of this, and now what I do is this. No, I do this. Right? That's what I'm talking about, baby. We mapped the hell out of that, man. Then, um, split. Wait. Then join. Then join with an empty string. That's all I do. That looks good to me, right? I don't know why we're getting two of the same outputs, but it should be okay. Ah, we sent it twice.
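The "map then concatenate" step above can be pictured in plain JavaScript: one table row per video, joined into a single string that drops into the email's tbody. The field names and markup here are simplified assumptions, not the exact template from the build:

```javascript
// Sample digest items; the field names are assumed stand-ins for the sheet's columns
const videos = [
  { title: 'Video A', channelTitle: 'Channel One', thumb: 'https://example.com/a.jpg', duration: 615 },
  { title: 'Video B', channelTitle: 'Channel Two', thumb: 'https://example.com/b.jpg', duration: 480 },
];

// One <tr> per video, then join('') to concatenate them into one HTML string,
// the same idea as mapping the items and joining with an empty string in n8n
const rows = videos
  .map(
    (v) =>
      `<tr><td>` +
      `<img class="thumb" src="${v.thumb}" width="240">` +
      `<strong>${v.title}</strong> | ${v.channelTitle} (${v.duration}s)` +
      `</td></tr>`
  )
  .join('');

// The joined rows slot into the email body's table, which email clients
// handle far more reliably than modern CSS layout
const emailBody = `<table><tbody>${rows}</tbody></table>`;
```

This is also why the template has to be table-based: as noted earlier in the build, HTML email clients expect tables rather than div-and-flexbox layouts.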
Maybe we just run it once. Probably enough: execute once. Yeah, sorry about that. Oh, where's the thumbnail at, man? Why aren't we getting the thumbnail? Yeah, there's the data right there, but it's not rendering. That sucks. Why is it not rendering? Nope, that doesn't work. Why doesn't that work? Isn't that the whole idea, man, that you can insert variables like this? This is brutal. I really want to do the HTML template thing if I could just use code to do it. See, we may have to just fix all this, man. I got all the stuff out. Huh? We need to delete this, right? Well, it's now outputting the URL, which is cool, with the X at the end of it. So, could I concatenate? Nice. That's cool. Source is this. Then I want to concat one more. Is that going to work now? Oh, come on. Chop. Nice, that actually did work. Cool. Very cool. All right, so what are we doing over here? Video table, thumb container, thumb. So this is a class of thumb. We'll go source, class equals thumb. Okay. Oh yeah. Okay. And then, now that we're all done with that, what are we doing here? Okay, we now send one item. Hello, email. What is going on? I think I know what it is. We kind of had a good thing going here for a bit, like 15 minutes ago. Give me that email template, man. What were we feeding in here? Image source, large thumbnail, container on the outside. Are we still doing that? Right: it's image source equals thumb concat, class thumb. Looks good. I don't see any issues with that. So why is the email coming out all broken? A quick div checker online, please. Go over here to... yes. Mhm. Oh, that's useful. Come on, man. That's just not nice at all. Why would you break on me? Container, right, video table, right. Remember earlier when this was working fine? Not that one. Okay: table, tbody, tr, thumb container. So it's tr, td, div, right? Looks good to me.
But for whatever reason, when we turn these on, they break. And also, we're running the same data over and over again. Why is that? We should have new data, right? Like, I'm not seeing any output data on this branch. So why is this going: two in, four out, or filtering two? And for the first run it's 90%. I don't fully understand what's going wrong here, but something is. We pumped out two channels. Good. For both channels, what we do is loop through the Google Sheet. Looks good. Then I'll put a filter. The filter keeps one, discards one. So now we have two items left. Two items on the left side, that looks good. Right side, that also looks good. Loop one. Yes, we are actually getting them now. Wonderful. We're now just going to create a new ChatGPT template entirely, one that is much better looking than this. Let's go to chat, go over here, change this to o4. The first string is my HTML template: I am creating a simple daily digest app that includes trending videos. Above are HTML templates that are sent via email; wrap the HTML in the second... Still looks like... come on, man. What is going on with these black bars? Okay, just crop the images in on both sides. Height should be... what is the height right now? Grab the height of this. Where's the height? You would have thought that San Francisco would have better OpenAI access, huh? Apparently not. Oh my goodness. It's good, but that's too much. Okay, so here's what we're going to do. I'm going to find the specific snippet of code where it actually cuts into my images. Okay, so height: 155px. That's way too small. What we're going to do is double it: 310. Boom. Is there anything else that's 155 here? No. This is good. Supposed to be good, but yeah, it's too much. Okay. Multiple: undefined. So we also need the multiple. Where's the multiple, where's the average?
So, where we get the multiple, what we're going to do is concat $json.views divided by $json.average. Okay. All right, this is going to be it. Oh yeah, sorry, we need to round it. How do we round this? Probably Math.round. Okay, I'm going to run this one more time in just one second and then we should be good. Then, instead of two or three, why don't I add... well, we should probably do like five or something, right? I should probably run this once before I start saying stuff and have it screw up on me during the demo. Let me just make sure I can still see my mic and everything. Yep, no issues there. Looks good. We'll run it now once. Clean as hell. Nice. That's perfect. Multiple: one. Oh no, one more. Y'all ever done this? Okay, we need to round the decimal places. Okay, we just need two. And that's that. Hopefully you guys
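The multiple being wired in here is just views divided by the channel average, rounded so it reads cleanly next to the title. A minimal sketch of that expression in plain JavaScript, assuming the average field is the one the filter node attached earlier (the numbers are made up):

```javascript
// One item carrying both its own views and the channel's average views
// (field names are assumptions from this build)
const item = { json: { views: 7500, average: 4456 } };

// Multiple = views / average, rounded to one decimal place:
// scale up by 10, round to the nearest integer, scale back down
const multiple = Math.round((item.json.views / item.json.average) * 10) / 10;

console.log(multiple); // 1.7
```

The same scale-round-unscale trick generalizes to two decimal places by using 100 instead of 10.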

Outro

saw firsthand how to build this sort of system live. I love doing these sorts of builds because you guys can actually see the thought process of a real AI automation developer, not just the fancy finished product at the end. If you enjoyed this sort of thing, then definitely check out Maker School. It's my zero-to-one accountability roadmap where I will personally coach you through 90 days of AI automation agency building, with a list of daily tasks and pre-recorded Loom videos that walk you through everything you need to do in order to get your first customer. And if you don't get your first customer, you get 100% of your money back. Otherwise, if you could do me a major solid: we just hit 100,000 subscribers. If you aren't subscribed to this channel already, please do. Every sub means a ton. Like, subscribe, do all that fun YouTube stuff, and I'll catch you all on the next video. Thanks.
