# The LinkedIn Parasite System (10X Your Followers with N8N)

## Metadata

- **Channel:** Nick Saraev
- **YouTube:** https://www.youtube.com/watch?v=roTg7kyw8X4
- **Date:** 16.06.2025
- **Duration:** 1:26:17
- **Views:** 39,947
- **Source:** https://ekstraktznaniy.ru/video/11928

## Description

Access the template in the video & Get automation customer #1 by joining Maker School ⤵️
https://www.skool.com/makerschool/about?ref=e525fc95e7c346999dcec8e0e870e55d

Watch me build my $300K/mo business live with daily videos + strategy ⤵️
https://www.youtube.com/@nicksaraevdaily

Summary ⤵️
This video shows how to build an AI-powered LinkedIn growth system using N8N that scrapes viral posts, rewrites them with a unique twist, and auto-posts daily to grow your audience hands-free.

My software, tools, & deals (some give me kickbacks—thank you!)
🚀 Instantly: https://link.nicksaraev.com/instantly-short
📧 Anymailfinder: https://link.nicksaraev.com/amf-short
🤖 Apify: https://console.apify.com/sign-up (30% off with code 30NICKSARAEV)
🧑🏽💻 n8n: https://n8n.partnerlinks.io/h372ujv8cw80
📈 Rize: https://link.nicksaraev.com/rize-short (25% off with promo code NICK)

Follow me on other platforms 😈
📸 Instagram: https://www.instagram.com/nick_saraev
🕊️ Twitter/X: https://twitter.com/nicksaraev

## Transcript

### Introduction [0:00]

Hey, I'm about to build a LinkedIn parasite system live in front of you that automatically scrapes high-performing posts from top creators in your niche, finds article information and unique twists for that content, adds all of that to a rewriting flow to create a unique piece using AI, and then even goes as far as literally automatically posting it to LinkedIn for you, to make growth easy or to let you sell this to people for 1,500 bucks a pop or more. If this is your first time here, my name is Nick. I scaled my own AI automation agency to over $72,000 a month, and I'm now leading the biggest AI automation community, with almost 3,000 AI automation freelancers and agency owners. I'm going to build this entire thing for you from scratch using n8n. The idea is to walk you through my exact thought process and also show you all the dead ends and detours that most other people leave out, so you can see what an actual, real development process looks like. Let's get into it.

### Demo [0:41]

Okay, and here is a demo of the finished flow. What I've done during the build is create a bunch of documentation to make it really easy for anybody who uses this template inside of Maker School to just take it, make adjustments, and do the same thing we're doing. It's composed of four steps: first an initialization step, then a scheduled scraping of source posts, then a scheduled parasite flow, and finally a scheduled LinkedIn poster. Let me show you how that works. I'm going to be moving around this little trigger, and everything rests on this Google Sheet.

The very first thing in the flow is initialization: we need to find a list of creators based on a search term. I'm using the search term "coding" because I want to find content from people who post about coding. I click execute workflow, and the first thing that happens is we scrape a bunch of LinkedIn posts using a third-party scraper called Apify, based on the search term we provide, in our case "coding". After the search, we end up with a list of 200 items with posts, and these posts include likes, comments, shares, and all the surrounding data we need to identify the viral creators, the people who are consistently going viral, with a little filter. So we filter it down, and at the very end we have a filtered subset of all the high-quality creators posting about this stuff. In our case we ran on 200 posts and ended up with six high-quality creators, who received 216 likes on one source post, 417 on another, and so on and so forth.

Now, speaking of source posts, what do we do next? We schedule a scraper that runs once per however long we specify. In my case, I went to the schedule trigger and said once a week: go and scrape the most viral of the source posts from these creators. So I just move this little schedule trigger off here and move this back down. If I click execute workflow, it reads through this sheet, grabs all of these people, and pumps them into another set of scrapers that find the specific high-quality posts these people have made. Once that's done, we add them to our source posts sheet. This database, this tabular format, includes things like the actual content itself, the post URLs of the images, and so on.

After that, we start our parasite flow, where we feed content into AI to produce not just regurgitated, repurposed stuff, but genuinely high-quality, AI-generated and AI-researched content with twists. After we execute that, we read all the source data we just pumped in here, then pass it through AI to find a bunch of web data, articles and whatnot, that are related to that content, not necessarily the same thing, but related. Then we feed it into AI to analyze the image that is provided. We then generate an outline with an image description of the specific content so that we can regenerate it afterwards, right over here. After that, we append it to a Google Sheet, and then we have some little logic to prevent rate limiting and ultimately cycle through this flow over and over. The generated content is high-quality, and it is written in my own personal tone of voice.
It's not the same as the source content, but it takes inspiration from what the source content is talking about. Odds are these creators are talking about things their market wants, things that are very relevant. So this is a quick and easy way to repurpose the same ideas but with a different tone of voice and, ultimately, different, higher-quality content. You'll also notice the Google Sheet has a post status of "draft". That's where this final flow comes in. If I stop this, we can also schedule an automatic LinkedIn poster. All that LinkedIn poster does is go through this sheet right over here, identify posts with status "draft", and then actually publish them live on your LinkedIn. By scheduling this to run once every day, we can build a fully autonomous LinkedIn posting machine that posts at whatever cadence you want. So that's what we're going to end up building. I'm just going to delete that, because I think all the testing I've done has pissed off a bunch of my followers. Let's get into the actual build itself.
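Sketched as plain functions, the four scheduled workflows in the demo line up roughly like this. This is a minimal outline in Python; the function names and tab names are illustrative, not from the template itself:

```python
# Hedged sketch of the four workflows described above. Everything reads from
# and writes to one Google Sheet with tabs for creators, source posts, and
# the generated drafts that the LinkedIn poster consumes.

def initialize(search_term: str) -> None:
    """Run once: scrape ~200 posts for the term via Apify, filter down to
    creators whose posts clear a likes threshold, append them to 'creators'."""

def scrape_source_posts() -> None:
    """Weekly schedule: for each creator, scrape their recent posts and
    append content, URLs, and image links to 'source posts'."""

def parasite_flow() -> None:
    """Per source post: research related articles, describe any image,
    outline with a twist, rewrite in your own tone, save with status 'draft'."""

def post_to_linkedin() -> None:
    """Daily schedule: publish one 'draft' row to LinkedIn and update its status."""
```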

### Live-build [4:36]

Okay, so here's what I'm thinking for the system. My rough draft is: first, I obviously need to find and test LinkedIn scrapers. I think I know where I'm going to go for that, hint hint, Apify, it's the next tab open, but I'm not entirely sure. Then I'm going to compile a list of LinkedIn creators that publish in my niche. I'm going to pick, I don't know, videography or something like that, so that it's not all AI automation and not very incestuous. After that, we're going to build a system that filters by the following, or the size, or whatever we care about, and then add it to a database.

Now, I want you to know: every time I say "database" from here on out, all I'm really referring to is a Google Sheet. And that's all a Google Sheet is: tabular data, which just means you arrange things with headers, then columns, then rows. So anytime I mention a database in this or basically any other video, you can just think of it as a Google Sheet and you'll do just fine.

After that, I'm going to set a schedule that scrapes the most recent posts from the list of creators and adds them to a second database. Then, for each post, I'm going to pass it into an AI prompt that does a little bit of research and a little tweaking. Whereas most other parasite systems just copy and paste the same thing verbatim, I think that whole approach is kind of low value and probably doesn't resonate that well. What I want to do is research the content of the post, tweak it a little, and then feed that back into AI and have it emulate my tone of voice before adding it to a third database, from which I'll post on LinkedIn using some sort of page credentials. That last part is optional; the LinkedIn posting setup can historically be a little tricky. But that, in a nutshell, is what I'm thinking for the system. It's very similar to a system I built for an 8-figure content company, so I'm pretty stoked about it.

Why don't we get started with the very first step. I'm just going to change my color to teal, and every time I'm done with a step, I'll knock it off the list. From there we'll obviously build everything on the n8n canvas. But before even starting with the n8n canvas, I first want to verify that I can do the LinkedIn scraping I want to do. So why does the step say "find and test LinkedIn scrapers" rather than "build a LinkedIn scraper"? Well, LinkedIn and other user-generated content platforms, social media platforms, typically have extraordinarily strong anti-scraping protections. So rather than spending all this time up front trying to reinvent the wheel, I'm just going to use a service that I know for a fact works pretty well for this sort of thing: Apify. Apify is basically a big marketplace where website developers and scraping pros post a bunch of scrapers and charge you a little bit of money to use them. Because of this, it's like a big capitalistic network of scrapers that compete. If I just type in "LinkedIn", you'll see there's already a bunch: a LinkedIn job scraper up here, a LinkedIn post scraper, a mass LinkedIn profile scraper, a LinkedIn Sales Navigator scraper. Anything you want, you can basically scrape.
And it's not just LinkedIn; it's also Instagram, Facebook, whatever. If anybody here is still using Facebook, God bless your soul. Today we're doing LinkedIn, and I'm not really sure which scraper is going to work. I think I'll probably use one of these. I don't really like how the pricing is 30 bucks a month, so I'm going to switch to pay-per-result. And, oh wow, that's really interesting: this one actually scrapes emails. But what I want is posts, because if you think about it, I want to be able to pump in a search and have that search deliver me LinkedIn results. So this is what this one looks like: profile username or URL. Looks like that's optional. Actually, it's not optional, it's required, and I don't really want one that's required. So why don't we look at this one? $50 per 1,000 results? No freaking way. Maybe this one; I want some sort of LinkedIn search. Okay, LinkedIn search: we can actually scrape LinkedIn posts by doing a search, at $5 per 1,000 results. That sounds pretty good to me.

Okay, so here we go. Now I can type something in. The example says "coding"; maybe I'll do coding. Sort type will be relevance, then page number. I don't really know anything about the pagination here, so I'm just going to roll with it and click save and start. What I'm trying to do is make sure that I can do this on the third-party scraping platform before I worry about moving it all into n8n. I'm seeing it says success here; everything went according to plan, and we scraped 50 outputs. And what am I getting here? If I just open this in LinkedIn, what exactly am I looking at? Danielle Mocha: "writing code is not equal to software development", coding versus broader programming. Okay, cool, so this is obviously a content creator. Very sweet. How about the second one? Third, fourth, fifth? Let me verify whether we're actually getting popular posts here or just random posts. Okay, actually, we're not getting popular posts; looks like that first one was just an anomaly. That's fine. We now have a way to scrape posts, which is pretty cool.

So I'm actually already done; this is fine for me. Do I know this is the best LinkedIn scraper out there? No. But usually I'll just try something that works, take that working MVP, move it forward, and then at the very end of the system, if I want to make optimizations, I can do it then. There's no point slowing myself down right now trying to find the best LinkedIn scraper on a cost basis or whatever. We'll pick something that works for now, move forward, and once we've built out the whole system, we'll know more about what we need to make it better.

Okay, cool. So the question is: we can obviously scrape a list of posts inside of Apify, but how do we take that over into n8n and actually get ourselves working with something? There are a variety of ways. Unfortunately, n8n has no built-in Apify node yet, so you have to use an HTTP Request node and manually set up the API call. This may seem really scary; don't worry about it. I'm going to do it really logically, step by step with you.
First, we're going to look up the Apify API docs. If you're working with any service, just type in the service name plus "API docs"; if they have an API, they'll have a docs page. What I'm looking for here is authentication. The very first thing I do with any API is go to authentication. Why? Because if you can authenticate, 99% of your work is done. So: you can find your API token on the integrations page. It's hyperlinked, which is really useful, so I'm just going to open it up. Over here you can see my API tokens. I'm actually going to delete this one, because, no wonder my other demo ran up my account, somebody clearly stole my API token. This one here, let's call it "LinkedIn parasite system", and create it. Okay, cool, now I have my API token, nice and long.

So what do I do with this? "Add the token to your request authorization header as a bearer token." Hmm, I'm not really sure what the hell that means, but that's okay. Let me show you why you don't even need to know this in order to interact with APIs: all you need to know is the endpoint you're going to call. So what do I want to do, logically? I just want to recreate in n8n what I did a moment ago in Apify manually, using the dashboard. And what did I do in Apify a moment ago? I ran the thing and got a big list of results, right? So is there anything here that allows me to run the thing and get a big list of results? I see "actor runs" over here: run actor, run actor synchronously without input, run actor synchronously and get dataset items. That last one is probably what I'm going to use, because it says I get the items. I'm kind of cheating here a little, because I already know how Apify works, I've recorded a million videos with it, but this is the one I want: run actor synchronously with input, get dataset items. If I didn't know which one, I would literally just try a bunch and find the one that delivers the output I want.

What's really cool about n8n is you can copy the curl. In the top right-hand corner, I find whatever says curl. If it's JavaScript, that's a no; we want curl (CLI), it goes by many different names here. You just copy this and paste it into... not this system. I guess I opened up a couple of systems called "LinkedIn" to see if I had anything similar, but that's not what we want. We want to go back to our main system here, LinkedIn parasite, paste in the curl command, and import it. The really cool thing about n8n's import feature is that it takes all that data, which was previously one string, and maps it into all the fields we want. So the URL is mapped, POST is mapped, and we can see where this authorization thing kicks in: you see where it says "Bearer" and then "<token>"? That's where we paste in the token. So I'm just going to grab the LinkedIn parasite system token, alt-tab, and paste it in. And voila, we now have our input, basically, which is cool. There's one other field we need to handle: over here the path has a colon followed by "actorId", which is the usual API convention for a placeholder.
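For reference, here's the same call assembled end to end outside of n8n, as a minimal Python sketch. The endpoint is Apify's run-sync-get-dataset-items route from the docs; the actor ID and the input fields (keyword, sort type, page) are per-actor and purely illustrative here:

```python
import requests

APIFY_TOKEN = "YOUR_APIFY_TOKEN"  # from the Apify Console integrations page
ACTOR_ID = "someAuthor~linkedin-search-scraper"  # hypothetical; taken from the actor's URL

# "Run actor synchronously with input, get dataset items": one POST that
# starts the run, waits for it, and returns the scraped items directly.
resp = requests.post(
    f"https://api.apify.com/v2/acts/{ACTOR_ID}/run-sync-get-dataset-items",
    headers={"Authorization": f"Bearer {APIFY_TOKEN}"},  # the bearer-token header from the docs
    json={"keyword": "coding", "sortType": "relevance", "page": 1},  # this actor's input JSON
    timeout=300,
)
resp.raise_for_status()
posts = resp.json()  # a list of post objects: content, author, engagement, ...
print(f"Scraped {len(posts)} posts")
```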
So where do we get the actual ID? The way I always get IDs is to check the URL of the thing I'm working with, and this isn't unique to Apify: the vast majority of services put the ID of the specific thing you want right in the URL. It works for, like, 90% of services. So: console.apify.com/actors/<some big ID string>/input. Odds are that's the actor ID, right? Because it says /actors, and then which actor, and then /input. Maybe after this there'd be another ID, which would be the input ID, but in our case this is obviously the actor ID. So I'm going to paste that in. Let me just see if there's anything else we need to fill. Looks like it automatically handled some redirects and things.

Now, if you think about it logically, what did I just do? I input an actor ID, but we also have the search keyword that I fed in. Usually the way this works is you feed in some JSON, and I see here there's a manual tab and a JSON tab. It looks like the only thing we feed in is keyword: coding. So I'm going to copy this over, and down at the very bottom where it says "JSON using fields below", I could go keyword: coding; or, you know what, I'll just use JSON directly and paste that in. See? Okay. Now I'm going to execute the step. Let's see if there are any problems; cross your fingers. If we can do this, we've already accomplished the first major step, which is compiling a big list of posts.

Okay, so how many did we get? I'm only seeing 50 here. Can we get more than 50? Can we go 51? Oh, can we only get 50? Okay. This other actor looks like it returns more than 50. It costs a little more, but we can set 100 as the search limit, so why don't I do 200, save and start, and see what happens? If we can get 200, that'll be way better, because the more we get per run, the fewer times we have to re-instantiate the whole flow. This looks pretty good to me. Damn, are we doing a lot of scraping right now. Let's look at the output: "Is coding really dead?", a very relevant topic. Okay, we're getting lots of fields and tons of posts. So yeah, this one's better.

Now that we've switched, I'm just going to update the ID of the actor. I'll go up to where I have the ID starting with "blu" and update it. This run looks like it finished and did a pretty good job, so I'll go back here. Oh yeah, sorry, there's one more thing I have to do: go back to the JSON input, grab this, and replace the old input with the new one down here. I don't really care about comments, so there's scrape comments: false, scrape reactions: false, search queries: coding. That looks fine to me. Let's execute this and see what happens. Hmm, it's saying "the expression evaluated to a falsy value". I'm not really sure what that means; I don't think I've ever seen this one before. Let me try undoing this, and let me make sure the ID I'm dealing with here is the right one. The one starting with "blu", that looks good. The JSON I just fed in? How about I try this?
So maybe there's an issue with the JSON. Feed that in. Not seeing any major changes, but maybe. My token's there. Okay, so why did the other one work and this one didn't? Okay, no, maybe it was some issue with what I was doing, because now it looks like it's executing. Cool. All right, so we just get rid of that old crappy one, and then, yeah, let's see. We just started a run, which is cool. That means it's currently running, currently grabbing us some data. Fantastic.

All right, let's take a peek at the data. We have a type, a post ID, a LinkedIn URL, from Danielle. Cool. Do we have reactions? We need reactions. Likes: 240. Okay, cool, this is good. So how big are the people we're scraping? Why don't we add a sort really quickly; I'll just do this internally as my own mechanism. Let me pin this. I'm going to use the sort now to sort based off of... sorry, where was the likes field? Good god, does this woman add a lot of images. Okay: engagement.likes, and we're going to sort descending. I'm doing this because I just want a sorted list of everything.

It says "the expression evaluated to a falsy value". Huh, weird. I've gotten this a couple of times now, so maybe there's some issue with my n8n instance. I'm just going to refresh; maybe that solves the problem. Every now and then you get inexplicable bugs, and it's better not to let them ruin your whole day. "Destination node not found." I think this might be happening because I pinned the data previously. Let me make sure there are no issues here: engagement, descending, feeding this in. Okay, not sure what happened there. I feel like I may need to rerun this whole thing, which is kind of annoying. I don't want to, but I think I'm going to have to. So let me unpin all of these and try executing the workflow again. What exactly is "disabled" is my question. Odd. If I add a new trigger, will that work? Execute workflow. No: "cannot read properties of disabled". Odd. Headers look good, JSON looks good. A very inexplicable kind of bug. Maybe I'll try turning off the redirects and giving that a run. No.

So here's what I'm going to do, because I think this is tied to my HTTP Request node. Imagine I just sort this: yeah, it works. So clearly there's some issue with the HTTP Request node. I don't know what the issue is, but I imagine there's something, so I'm just going to manually remap the data. I know it's annoying, but I don't want this to slow me down, and I want to see whether there's just some issue with this initial HTTP request, something I'm not seeing. You might actually be able to spot it on your end; I definitely don't. So I'll go back here and map all the fields in. That one says authorization, so we'll go up here, add authorization, and paste in the key. Do we have any headers? Oh, sorry, these are supposed to be headers, my bad; I'll paste this in here and this in here, and get rid of these. We're not going to send query parameters; we are going to send a body, and I'll use JSON. Actually, before I paste this, maybe there's an issue with the JSON itself, so I'm going to paste it into a little formatting tool first, then go back in again. All right. Okay.
So I'm thinking this is going to be okay. I don't know; let's execute it. I'm executing this independently; I haven't hooked it up to any execute-workflow node, which you can do in n8n, and I've isolated it, so it should be fine. Okay, I got results back, that's nice. I should be able to pin this now. If I execute this, will it work or will it be broken again? No, that looked good. So now I'll delete this and go to my sort node, the same thing I wanted to do before: sorting based off of engagement.likes, descending. Execute. Okay, we've now executed that successfully. Awesome. And it looks like the highest number of likes is pretty high, 2,000, so that's fine with me.

Now that I've verified this works, why don't I add a filter? What I want to do is go back to that same field, engagement.likes, so let me reference json.engagement.likes, and just check: is this above, say, 100? If it's greater than 100, good, I can add the creator to my pool. So let's see: of the 200 we fed in, how many are greater than 100? Fourteen. Cool.

What I'm going to do now is add all of these to a Google Sheet. So: append a row in a sheet, using this spreadsheet here called "LinkedIn parasite system". Can we find it? Yes, we can. Oh, actually, I should probably create the columns first. What am I going to call this sheet? Creators. What fields do we actually have? Author. Author is what I'm going to use, so let's do LinkedIn URL; that looks pretty good and will be my main field, since it's different for every individual, right? Then we'll add name, feed that in there, that'll be useful. This one is the headline, so we'll feed in the headline. We'll add posted at. And then we'll add engagement on post, or maybe likes on post. Why am I doing it this way? Because I want to verify that I'm not going to be feeding in the same person over and over again.

Okay, that looks good to me. Let's retry this. We fetch the columns, and I'll map them one by one: the LinkedIn URL of the person, which is what we care about, the creator; their name; their headline, that's just info, and I know it's called a headline on LinkedIn; the date stamp; and likes on post, which was engagement.likes, right? Cool. So now we should have 14 creators that I'm about to add. Let's execute this workflow and see if we can add them to our first database. It's probably going to bump me down quite a bit. Nope, looks good. Very cool. Look at that. And we also have the number of likes on their posts. Looks like this Danielle Mocha is pretty big into coding. You know a creator is big when they have a simple headline like "I help you craft better software". That is some big dick stuff. Okay, cool. So if you think about it, what have we done now? We have our initial database instantiated.
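Continuing the Python sketch from the Apify call above, the sort-filter-append chain amounts to this. Field paths like `engagement.likes` and `author` follow the scraper's output as shown in the video; exact names vary by actor:

```python
MIN_LIKES = 100  # the viral bar: a creator qualifies if a post clears this

# Sort node equivalent: engagement.likes, descending.
posts.sort(key=lambda p: p.get("engagement", {}).get("likes", 0), reverse=True)

# Filter node equivalent: keep only posts with more than 100 likes.
viral = [p for p in posts if p.get("engagement", {}).get("likes", 0) > MIN_LIKES]

# Append-row equivalent: one row per qualifying creator, matching the
# "creators" sheet columns set up in the video.
creator_rows = [
    {
        "LinkedIn URL": p["author"]["url"],  # assumed field path
        "Name": p["author"].get("name", ""),
        "Headline": p["author"].get("headline", ""),
        "Posted At": p.get("postedAt", ""),
        "Likes On Post": p["engagement"]["likes"],
    }
    for p in viral
]
print(f"{len(creator_rows)} creators cleared the bar")
```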
Now I can pin all this, and I'm going to create a sticky note, which I think is Shift+N... is it not Shift+N? Note, sticky note. Oh, nice, they upgraded the sticky notes; they look really clean now. I'm going to write: "Initialization: first, run the top flow to find a list of creators who make posts that regularly get 100 likes or more." This is sort of our step one, and it will make it really easy for everybody to see. Let me also say "modify the search term to change your target audience." That looks good. I don't know why I can't see this... oh, there we go. And where's the search term? Search queries. Let's do: "modify search queries in the JSON body of the HTTP request to change your target audience." Okay, cool.

So now that we've done this initialization step, we can compile a list of LinkedIn creators who publish in the niche we want, which in our case came through this Apify actor and was dumped into a Google Sheet called "creators". The reason I did it this way is that I wanted the system to be a lot more granular: instead of tracking just a fixed list of people to run our parasite system on, we track a search that can instantiate any list of people. That makes the system a lot more flexible, and also more valuable.

The next step is going to be... actually, filtering, sorry, we've already done that; that was step two. The next step is scraping all the most recent posts from these people and adding them to a second database. The second database in our case is just another sheet: instead of "creators", it's going to be posts, maybe "source posts", we'll call it that. And after that we'll have, I don't know, let's call it "destination posts", which is what the AI is going to fill for us. So: hourly, daily, or weekly, scrape their most recent posts and add them to the second DB. I'll go back here and add a little documentation ahead of time, "scheduled scraping", and we can fill out the rest later.

Now, instead of an execute-workflow trigger, I'll use a schedule trigger. I am going to need a manual trigger to test this out, but if you think about it, we need to set some sort of cadence. Maybe I'll do once a day here: trigger at midnight, and every day at midnight we start this flow. So the question is, what the hell do we start? Well, we now need to scrape the LinkedIn posts themselves, based on the person we're doing this for. So I'm going back to Apify to find something that lets us scrape... oops, that's not what I wanted to do... LinkedIn posts by the ID of the person we feed in. I think this might be it; I don't know for sure, so I'm going to open up a couple more, because clearly last time just opening one didn't work out so well. This looks pretty solid to me: a profile post bulk scraper. Is this the same one I'm using right now? I don't know; we'll see. This looks pretty cool: we can feed in a list of target URLs. How amazing is that? This is what I'm going to use, or at least what I'm going to try using first. And because I want to make this as relevant as possible, I'm going to feed in a bunch of the URLs from my sheet and see if they work. This person here, Danielle Mocha, with the 2,000 likes, let's feed them in. Looks like I've got a bunch of additional query parameters on the URL; I don't know if those are going to ruin my search.
Hopefully not, but let's go with it. "Maximum number of posts to scrape for each input", which overrides pagination; we'll just do 10. Then there's a "posted limit" filter. Oh, this is really cool: we can grab only posts from the last 24 hours. This is really valuable, because it means we don't have to implement any additional logic to de-duplicate posts: if we run our scheduler at the same rate as this filter, it will only ever grab posts from the last 24 hours. Let me double-check: has Danielle posted anything in the last 24 hours? Probably, since it looks like they make money off LinkedIn. Yeah, two posts: one 17 hours ago and one from a week ago. So we're kind of lucky. Why don't I set this to a week, though, and for now set the schedule trigger to a week as well, so the two match up and we actually get some posts to work with. We'll do the same for Manali. Do we want reactions, comments, advanced options? No, I don't really think we want anything else. Let's save and start.

What you'll notice is that I don't actually spend all my time inside the n8n canvas, because I want to verify that the thing I'm trying to do works in the first place. I've mentioned this a few times throughout my live builds: this is basically starting at the end, not the beginning. I'm starting with the end thing I want, which is scraped posts, hard-coding it all into their dashboard, and only once I've verified that works do I actually tackle the API side of things.

Cool, and it does look like this worked. We have content here, which is really cool; this is what we'll feed into AI. Wonderful. Oh, look at this, so cool: post images. Looks like we can grab some images too. We could actually iterate through the images and feed them into ChatGPT alongside the post. Then there are some comments as well, which is kind of cool.

What am I going to do now? If you think about it, we've got two things to do: one, iterate through all the people on the sheet, and two, for every person, add them to the search. Let's do that second half first and set up the search logic, because that's pretty easy, right? We go back here, grab the ID, and swap the old actor's ID for the new one. The API token stays the same. Then all we need to do is adjust the body. So what is the body? I'll feed everything in here, and let me go to a JSON formatter again so I don't screw this up. It looks like we feed in all the JSON here, and there's an array called "target URLs", spaced out for whatever reason, I don't know why, where we feed in the data itself. That's fine by me; no major issues, I just wanted to double-check. So now, can we run the exact same thing manually via the API? Unpin and test. Let's run it. It says "waiting to execute", which is a great sign; usually when it says that, it's actually off executing. Nice, it went through and executed. Cool. So now we have all the data, and we've basically made it all the way down to "scrape their most recent posts", ready to add them to a second database. The thing is, we need to get the input into this node; right now we're hard-coding it.
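As a reference point, the input body we just hard-coded looks roughly like this. A sketch only; the exact key names are whatever this particular actor's input schema defines:

```python
# Hedged sketch of the profile-post bulk scraper's input, per the dashboard
# fields shown: a list of profile URLs, a per-profile cap, and a recency
# window that doubles as de-duplication when it matches the schedule trigger.
run_input = {
    "targetUrls": [
        "https://www.linkedin.com/in/danielle-example/",  # hypothetical profile URL
    ],
    "maxPosts": 10,         # "maximum number of posts to scrape for each input"
    "postedLimit": "week",  # only posts from the last week; keep in sync with the scheduler
}
```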
So what's the input going to be? A list of all the creators here, right? How do we get that? We need a Google Sheets node. I'll go to Google Sheets and get all the rows of the database. This is just going to proceed logically, top to bottom: we grab all of these people, then we get all the posts for each of them, dump those here, and then for each of those posts we do some rewriting and dump the results over here. At the end we'll just have a big list of stuff we can post. That, to me, makes the most sense. I may be getting ahead of myself, but I think this is going to work.

So let's point this at the same spreadsheet we had before, "LinkedIn parasite system". This time the sheet we grab is "creators". Now, normally I would implement some logic at this stage to make sure I'm not scraping the same creator twice within a certain period. But what's really cool about this Apify actor is that it lets us take only posts that are within 7 days of today, so we don't have to worry about any of that. Fantastic. So yeah, to be honest, I think this is all I need to do. Let's execute it: we now have a list of 14 items.

The question is, how do I feed these 14 items into this node? Because check this out: this is 14 items, so everything after this will run once per item unless I do some fancy logic. What I actually want to do is aggregate them, so I'm going to use an Aggregate node. The reason is that I could technically do this through the code editor, but I don't really want to; instead, I'll aggregate all item data into a single list. It'll grab all of that and stick it inside a "data" field. Let's pin these so we can rerun anytime we want. Hmm, do we actually want to output all fields? No; the only thing we really need is the LinkedIn URL, so I'll aggregate just that field. Let's try executing this now: we should get an array with a bunch of LinkedIn URLs. Yeah, looks good to me.

Hold on a second, what was the format we were feeding in over here? Oh, it's just a field called "target URLs" with a big list of URLs. So we actually need to flatten this. Can we flatten it? Why can I not think of how to get this into a single list? "Put output in field: data"; if I just leave this empty, what happens, do I dump it all into one thing? No, it's an array of objects keyed by "LinkedIn URL". Well, that's annoying. Let's feed it into "data" for now. Cool, looks like we got that. Now I'll go into an Edit Fields node and map this, which is kind of annoying; ideally we wouldn't have to, but whatever. Let's call the output "target URLs" so it matches up basically perfectly, with an array data type. Then I grab "data" here, so $json.data. Right now it's an array of objects, and I just want to map over it.
For every item, I just want to return the item's URL. So now we should have an array of just the LinkedIn URLs, which is fine. If I execute this, we get just the target URLs, which is nice.

Now, if you think about it, what do we do? We feed these target URLs in right over here, at least that's the idea. So let's try it. I'll have to change this to an expression field, which is going to muck up my formatting, so I'll go back here and paste that in. And what are we seeing? Hmm, doesn't look good: for whatever reason it's stringifying the array. Can we set the type to array? Nope. I do not like this. We can't just map the item directly; it doesn't work. What the hell do I normally use? Okay, let's go back here and execute this. We got an array... oh yeah, my bad, sorry, I swapped the wrong thing. Paste that in, and now we should get a big list of target URLs.

Then over here, the trick, which took me a second to figure out, is that after we feed in the target URLs, this is still just a giant string; if you think about it, it's not proper JSON. What we have to do is wrap it in JSON.stringify, which actually renders it as an array in the format we want. So now this should work. I'm not entirely sure, but it looks good to me; if we're not sure, we can feed it into the formatter, and if it parses, we're fine. So let's go back here and execute the step. Let's see what happens.

Oh, you know what, look at this: this one's broken. This is not a URL. Ah, spotted: we need to trim these, definitely need to trim these. Although... it still looks like it actually worked, so maybe it's not that bad. Okay, well, never mind: if it ain't broke, don't fix it, right? Cool. We fed one item in, and now we have 99 items back.
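The same flatten, map, and embed chain in the running Python sketch. In n8n this lives inside expressions; the JSON.stringify step is what keeps the array a real JSON array once it's pasted into the string body:

```python
import json

# Aggregate node output: one item with all creator rows collapsed into "data".
data = [{"LinkedIn URL": " https://www.linkedin.com/in/example-creator/ "}]  # illustrative

# Edit Fields equivalent: map each row to its URL, trimming stray whitespace.
# (The "this is not a URL" bug above was exactly an untrimmed value.)
target_urls = [row["LinkedIn URL"].strip() for row in data]

# HTTP body equivalent: embed the list as real JSON, not a stringified object.
# In an n8n expression this is the {{ JSON.stringify(...) }} trick.
body = json.dumps({"targetUrls": target_urls, "maxPosts": 10, "postedLimit": "week"})
print(body)
```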
The question is, what do we do with these 99 items? We have a couple of options. We could split this workflow up so there's one scheduler that dumps records into source posts and another that dumps them into destination posts. I typically like that, because the pieces are less dependent on any one individual workflow. Think about it: by splitting them up, if something screws up in workflow one, it's contained to workflow one; if it screws up in workflow two, it's contained to workflow two; if it screws up in workflow three, it's contained to workflow three. Data can still keep moving through the rest of the system. This will increase the number of Google Sheets API calls we make, which I'm usually a bit wary of, since Google Sheets and n8n have a somewhat tenuous relationship, but I think it makes sense here.

So: "scheduled scraping of source posts". And here we'll note: "a scheduler will run once per however long you like; just modify the schedule trigger. It adds source posts from these creators to a source posts tab." That looks pretty good to me. Once we're done with this, we obviously need to add the sheets, so let's go to Google Sheets. Actually, we can copy most of this logic; the data will just be a little different. Let's unpin this and open it up again.

So what's the data now? Well, I want all of it, because I'm going to use all of it in the AI step. Let's go all the way up to the top here. We need the ID of the post, so I'll take ID, post URL, content; that's great. What else? We have the author information. When you have data that references itself like this, it's good to have the same field, at least the primary key, in all of the tables, so you can very easily match data later on. So I'm going to grab the LinkedIn profile URL of the person as well. I don't need the name or anything; I don't really see a reason for it. We'll take posted at as well. Then, if there are any post images, we should definitely get the images.

Hmm, how should we do this? I think it makes the most sense to insert three images. I don't want to do all of them; just 80/20 it. We could theoretically grab every image, but does that really make a difference to us? I don't think so. So I'll grab post image one and post image two, which probably covers 90% of all posts, and maybe post image three just in case. Then we can hard-map them: URL one, two, and here, three. Okay, that's what's populating in my head right now. Anything else I may need? ID, post URL, content... I mean, there's a bunch of other information here. "Transcribed document URL": I don't know what the hell that means. What does this mean? Fascinating: it's like a document somebody attached to this post. Wow, cover pages. What's this, "Python for beginners"? So it looks like somebody actually attached a whole document here, which is wild. There are reposts too. Anyway, no, I'm not going to worry too much about this; I'll just leave it at that.

So now I'll go back and delete all these fields, because they're from the old setup, then refresh the column list. Oh, sorry, I need to change this first: this says sheet one, and it should be source posts now. Now I'll map all of these one by one: ID; post URL, which was right over here; content; the creator's LinkedIn URL, right over here; posted at, right over here. Post image one URL is going to say it's undefined right now, and that's okay: it's json.post_images, then index zero, then .url. We'll do the same with two and three. Hopefully, if it's undefined, it just comes through empty; I don't actually know for sure.

Now, here's the thing: we have 99 API calls, basically, that we're about to make, and I don't want this to bust me and my Google Sheets. So I'm just going to click "minimize API calls", and then I'm just going to pray. Pray with me: this is one of the most common failure cases and gotchas in n8n; you try to dump a hundred things in and it just does not work. Okay, going back here... uh oh... well, I think I might have spoken too soon, because it looks like it did indeed work. Wonderful. Nice. And yeah, the post images: looks like all of these only had one image. Let's open one up and see. Cool, this is great; lots of data in these post images. "Type me on the big five personality traits", pretty cool. So now we have everything we need to actually proceed with the GPTification of this content.
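In the running Python sketch, the per-post row mapping with the optional images looks like this. `post_images` is the assumed field name from the scraper's output; missing indices fall back to empty strings rather than erroring:

```python
def image_url(post: dict, index: int) -> str:
    """Return the nth image URL, or '' when the post has fewer images."""
    images = post.get("post_images") or []
    return images[index].get("url", "") if index < len(images) else ""

def to_source_row(post: dict) -> dict:
    # One row for the "source posts" tab, mirroring the columns mapped above.
    return {
        "ID": post.get("id", ""),
        "Post URL": post.get("url", ""),
        "Content": post.get("content", ""),
        "LinkedIn URL": post.get("author", {}).get("url", ""),  # shared key across tabs
        "Posted At": post.get("postedAt", ""),
        "Post Image 1 URL": image_url(post, 0),
        "Post Image 2 URL": image_url(post, 1),
        "Post Image 3 URL": image_url(post, 2),
    }
```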
Right. So, back to the checklist: hourly, daily, or weekly... wait, one more thing, I forgot about the research-and-tweak step. Okay, you know what, let's not research and tweak yet; let's save that for the next piece. What we're basically doing now is modularizing: this, for instance, is one module; this is another module, which is what we just built; and this over here will be the third module. Actually, I think we're going to need four, now that I think about it, if we really want to be smart about it. So this is roughly what our infrastructure looks like.

Anyway: for each post, we now need to pass it into an AI prompt that researches and tweaks the post. What does that look like? Well, I'm just going to pin this. All we do, again, is put everything on a schedule trigger, whatever cadence we want, and run the exact same logic; it's just that this time, instead of getting creators, we get posts. So I'll go down here and call this "scheduled parasite flow". The note: "a scheduler will run once per however long you'd like; just modify the schedule trigger. It will feed the post content and images into AI to have it rewrite things in your tone of voice. To modify your context prompt, adjust the context section of the LLM call." Cool, that looks good to me. Let's actually create this. Let me go up here and write "three".

So: we trigger this off a schedule, and the Google Sheet we read is not creators now, but source posts. Let me unpin all this. Goodbye, pins. We're going to read through the source posts. This is another one of those API-call-heavy things, so I'm going to return only the first matching row. Why? Just because I don't want to trigger a million API calls. It should only get one row... okay, well, it just went through and returned all of them, I suppose. So there's that. Thank you, "return only the first matching row". No, I'm just kidding. Let's pin these now.

Now the question is, what do we do with each row? We have to feed each one into AI and do something with it, so we're not going to aggregate these. I'm very conscious of rate limits here: when you pump a hundred things at once into AI, it can take a variable amount of time to get through any of it. So I'm going to add a Loop Over Items (split in batches) node here, and build the logic that follows inside this loop. What's the first thing to do, if you think about it? The very first thing is to set the batch size to one, which I did. Now that the batch size is one, I want to feed things into OpenAI, so I'll add an OpenAI node and choose "message a model".

Now, if you haven't connected to OpenAI before, give this a tap: to get an OpenAI API key, head over to platform.openai.com and create an account if you haven't already. Go to the top right-hand corner, go to API keys, and you can create an API key there in a second. Then you just paste the API key in here, don't worry about anything else, and you're good to go. So the operation we want is "message model", but for the model I want some sort of web search. I'll type in "web", maybe "search"... I think it's GPT-4o search preview. This is what we want.
The reason why: what we want to do is search for things based on the content inside of the post and see if we can modify it a little. So here's my plan. I'm going to start by searching based on the post itself. Once I've done that, I'll see if there are some additional sources I can draw in. Then I'll feed the content of those sources, plus the original post, into a second prompt. That prompt will make some tweaks to the original and turn it into a bunch of bullet points. I'll take those bullet points, feed them into a third prompt, and that prompt will rewrite everything in my own tone of voice, according to my own style. That's my idea, anyway. I don't know for sure that any of this is going to work out; I may sound very confident about it, but rest assured, I'm whipping it all out of my butt.

Okay, GPT-4o search preview. The system prompt: "You're a helpful, intelligent research assistant that searches online for things related to the content of a social media post. Your task is to take as input a social media post and find three sources that discuss similar things to that social media post. You then return brief summaries of each source in JSON using this format." Then I define my format. It's going to be very simple; I'm just going to hardcode the actual elements themselves: source one summary, source two summary, source three summary. Why am I hardcoding this? Because I can; it's my data format, and I can do whatever the heck I want. Then rules. I'll say "be very comprehensive with your summaries"... well, I guess that's kind of an oxymoron: a summary by nature shortens and simplifies, and "comprehensive" goes against that. So: "lean towards detailed summaries. If you can find new information not present in the source content, make sure to include it. Ensure your summaries are at least one paragraph long." Okay. Output content is JSON, over here, and the temperature I'll use is 0.7. I always do this. Why? I don't know; force of habit, I suppose.

So now we're going to feed one item into this flow. Cool, we have the item fed in via the loop branch, and I'm going to feed it into AI. Now, can I pin this? I don't believe I can pin this node, but I can probably pin the one before. I'm going to execute this step on just one item and see what it says. Okay, never mind: turns out you can't include temperature in a GPT-4o search call; I don't believe they support it. So we'll execute without the temperature. Unfortunate, but it is what it is. Okay, never mind, I take that back as well: we can't use the JSON object output format either. Well, that kind of sucks. Screw it, let's give it a try. What else, can we not include the system prompt? Looks like we'll just have to parse the JSON ourselves. So: content. Nice. All right. Well, I mean, it still output things in JSON; not the best JSON, but JSON. Do we need it in JSON? Not really. Does the JSON need to be perfect? I suppose not, to be honest. If the model doesn't let us force JSON output, that's okay; we'll just use what it gives us and feed it into the next thing. So why don't I rename this node "find web data".
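For reference, here's that research step as a standalone sketch in Python. The prompt text is paraphrased from the video, and note the two limitations hit above: the search-preview model rejects both temperature and the JSON-object response format, so the JSON is requested in the prompt and parsed manually downstream:

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

RESEARCH_PROMPT = """You're a helpful, intelligent research assistant that searches
online for things related to the content of a social media post. Take the post below,
find three sources that discuss similar things, and return brief summaries in JSON:
{"source_one_summary": "...", "source_two_summary": "...", "source_three_summary": "..."}
Rules: lean towards detailed summaries; include any new information not present in the
source content; each summary should be at least one paragraph long.

POST:
"""

def find_web_data(post_content: str) -> str:
    resp = client.chat.completions.create(
        model="gpt-4o-search-preview",  # built-in web search; no temperature support
        messages=[{"role": "user", "content": RESEARCH_PROMPT + post_content}],
    )
    # No response_format={"type": "json_object"} here either: parse the text yourself.
    return resp.choices[0].message.content
```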
Okay, I'm going to copy this node now and feed it into my next step, because what I want to do now is generate a unique outline. This time, instead of GPT-4o search preview, I'm going to use GPT-4.1; it's the current best available model for me. And instead of a research assistant: "You're a helpful, intelligent writing assistant that takes content and some additional research and adds unique twists and more information to it so that it reads differently. Your task is to take as input a social media post and a bunch of supplied research, and generate a detailed, comprehensive outline of that content. However, you must also do one additional thing: make that outline unique by incorporating elements from the research as well as your own knowledge. Output your unique outline in JSON using this format." Then the outline rules: "keep all of the source content, or elements from the source content, but make it better, more comprehensive, more interesting, and/or add a twist. Do not repeat the content verbatim. Make your outline different while ensuring you maintain the same informational content at minimum. We are not copying the source; we are improving on it." Cool. "Output your outline in markdown", let's do ATX format, "aim for", I don't know, five to... let's just say "be comprehensive". All right.

So that's what we're feeding in here, which is pretty cool. Now I can set the output content to JSON and add output randomness, the temperature. Let me unpin this, and now we grab that previous data; I guess I do need to provide that previous data, plus the source. So: json.message.content, and then Loop Over Items... "no items were sent on this branch"? Sorry, what the hell are we talking about here? We do need the content, right? Yeah, content. Okay, so let's feed in the content. "This is a piece of content"... I don't know why we don't have access to this; that kind of sucks. What I actually need to do is reference the Loop Over Items node: item.json. So I'm reconstructing this reference right now, even though the editor doesn't show me the data, which I think should work. Then the research goes here. We're just giving it some very basic formatting so it knows what's research and what's not.

Now we should be able to execute this. This should execute first, finding some web data, and once that's done, it should generate a unique outline for us, or one unique outline anyway. Getting text on text; double-checking. Cool, nothing super urgent. This is obviously taking significantly longer, because we're using an intelligent model and asking it to do quite a bit, intellectually speaking. Cool. This is very intense: it's almost like a blog post for that source. But that's okay.
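Continuing the sketch, the outline step runs on a standard model, so JSON mode and temperature are back on the table (the system prompt is again paraphrased from the video, and the image description slot is only filled on the image branch built next):

```python
OUTLINE_PROMPT = """You're a helpful, intelligent writing assistant that takes content
and some additional research and adds unique twists and more information so it reads
differently. Take the social media post and supplied research below and generate a
detailed, comprehensive outline in markdown (ATX headers), returned as JSON:
{"outline": "..."}
Rules: keep elements from the source content but make it better, more comprehensive,
more interesting, and/or add a twist. Do not repeat the content verbatim. We are not
copying the source; we are improving on it."""

def generate_outline(post_content: str, research: str, image_description: str = "") -> str:
    # Basic labeled formatting so the model knows what's what, as in the video.
    user_message = (
        f"CONTENT:\n{post_content}\n\n"
        f"RESEARCH:\n{research}\n\n"
        f"DESCRIPTION OF SUPPLIED IMAGE:\n{image_description}"  # empty on the no-image branch
    )
    resp = client.chat.completions.create(  # client from the sketch above
        model="gpt-4.1",
        temperature=0.7,                          # back on the menu with a standard model
        response_format={"type": "json_object"},  # ditto JSON mode
        messages=[
            {"role": "system", "content": OUTLINE_PROMPT},
            {"role": "user", "content": user_message},
        ],
    )
    return resp.choices[0].message.content
```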
Hmm, we're not actually feeding in the image right now, are we? So maybe we should do more; maybe we should feed in the image as well. How do we do that? You go... LLM, sorry, ChatGPT, no, OpenAI: analyze image. What you do is feed in the source image. The model we want is ChatGPT-4o latest, and we'll say "describe this extremely comprehensively". The image URL we feed in is just going to be post image URL one, and we should only do this if the image is there; if it's not there, we shouldn't feed anything.

So here's what we should do. We could merge the web data and this, and we could switch... I'm trying to think about how complex I want to make this thing look, because the more complex it looks, the better it'll be on YouTube, but the worse I think it'll actually be in practice, since generally speaking you want to stay away from complexity wherever possible. If you can make things simple left-to-right flows, that's the best. If you're wondering "why all the left-to-right, that's boring": we do it on purpose, because it's maintainable, simple, straightforward, and anybody can read it. It's kind of like documenting your code, or writing documentable code.

So here's what I think we should do: add a switch here. If there is an image URL, so if we go back to Google Sheets and there's a post image URL one, if this exists, go down route number zero. If it does not exist, go down route number one. Let's rename these to "image exists" and "no image"; simple. Okay, so now, if you think about it: if there is an image, I want to feed the image in here, and if there's no image, I can go directly to generating a unique outline. If there is an image, though, we feed the description in alongside everything else. We don't strictly need to do it this way, but I think it's simpler to write it this way, so that's what I'm going to do. And then up here, if you think about it, we also need to feed in the output of the previous node. So what is that node going to be? Hmm, that's fine.

Can we execute the step now? Okay, we find the web data; let's pin this so I don't have to rerun it every time. Okay, cool, and this route had an image, so because it had an image, this node is accessible to me. The URL I'm going to use: post image URL one, except it's actually not from this node, right, it's from the Loop Over Items node. I don't know why that keeps happening, but it does. So now we should have post image URL one. Hmm, hold on a second: why are we going down the image route when there's no image? I don't understand. Strange. It's because I'm not using the right reference: we've got to go through Loop Over Items. Okay, so this should go down the "no" route, because this field is empty. Yeah. So why did it go down the image route again? Huh, weird. Oh, sorry: "exists" is the wrong operator; the field exists, it's just empty. So: if it's not empty, go down this route; if it is empty... let's try this again. I was mucking up my "exists" versus "is empty", because they mean slightly different things. Okay, cool: the no-image row goes down the right route now. All right, that's actually fine, and now we generate the unique outline. Cool. Very cool.

And what if we actually do have an image? Then we use post image URL one, which should be good, and feed the output of this into the outline node, which unfortunately isn't going to run in this test. So why don't I just hardcode this first.
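The image branch in the running sketch. `chatgpt-4o-latest` is the model picked in the video; the guard mirrors the switch's "is not empty" check, the fix for the exists-versus-empty mix-up above, and `post_row` stands in for the source posts sheet row:

```python
def describe_image(url: str) -> str:
    """Vision call: 'describe this extremely comprehensively.'"""
    resp = client.chat.completions.create(  # client from the earlier sketch
        model="chatgpt-4o-latest",
        messages=[{
            "role": "user",
            "content": [
                {"type": "text", "text": "Describe this extremely comprehensively."},
                {"type": "image_url", "image_url": {"url": url}},
            ],
        }],
    )
    return resp.choices[0].message.content

# Switch-node equivalent: branch on "is not empty", not "exists".
# The column exists on every row; it's just blank when there's no image.
first_image = post_row.get("Post Image 1 URL", "")  # post_row: hypothetical sheet row dict
image_description = describe_image(first_image) if first_image else ""
```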
That is also really big. Good god, man, that's huge. Can somebody please not post an image that's just a bunch of text? Okay, this is kind of funny. Let's do that. Let's just hardcode this now, disconnect this, and run it in isolation. Why am I running this in isolation? Because I just want to see what the data format is; once I know, I can hardcode it into my flow even though I don't have access to it. Looks like it's just content. So now I can feed this in here, and I'm going to get a variable called content, which I can feed in as the supplied image description. Let's label it "description of supplied image," feed in the content variable here, and then research. Okay. Now we have the ability to generate a unique outline with the top route, and we also have the bottom route. What do we do next? Well, we just need to merge them, if you think about it, because we want the flow to proceed regardless of which route the row took. So I'll add a Merge next. Feed this into input one... no, don't feed them both into input one; feed this into input two. It doesn't matter which is which, because only one of them is ever going to run, so the merge will always carry exactly one of these outputs, which is cool. I don't really like how this looks, so I'm going to mouse over it and hit Shift+Option+T to tidy up. Oh wow, that is way cleaner. Nice. And once I'm done with the merge, what do we do? We feed this into a third model call; getting really crazy with it. The purpose of this one is to regenerate the content. I should probably rename these: this one is "generate outline with image description," and this one is "generate outline, no image description." That's probably a lot more interpretable. Merge, then generate the new content. Okay, so what are we doing now? We're going to adjust the prompt so the model generates new high-quality content based on the outline we just produced. So: you're a helpful, intelligent writing assistant that takes an outline and uses it to write a high-quality piece of content for LinkedIn. Your task: take as input an outline, then output a high-quality LinkedIn post that adheres to character limits. Output your LinkedIn post in JSON using this format: post body. Then rules. What kind of rules do we have? I guess just my own writing rules, right? So let me supply it a big list of rules; I just have to go grab my prompt from elsewhere. Cool. And then here we'll add: do not be overly engaging. I think that's probably good. Be spartan and relatively informal, but maintain a professional, curious tone. Okay, that seems pretty reasonable to me. So I just gave it a bunch of rules on the tone of voice I want. And now, thinking about it logically, I just have to feed in the data, right? So let me disconnect this. I want to run all of this now, at least once, for one item of data. Let's run this and get it all the way over to the Merge node. Why did the workflow execute successfully without doing anything?
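The image-analysis step that feeds the top route is a single vision call. A sketch, again assuming the OpenAI Node SDK; the model string follows the "ChatGPT-4o latest" pick in the video:

```typescript
import OpenAI from "openai";

// Ask the model for an exhaustive description of the post's image, so the
// outline generator can work from text alone on both routes.
async function describeImage(client: OpenAI, imageUrl: string): Promise<string> {
  const response = await client.chat.completions.create({
    model: "chatgpt-4o-latest",
    messages: [{
      role: "user",
      content: [
        { type: "text", text: "Describe this extremely comprehensively." },
        { type: "image_url", image_url: { url: imageUrl } },
      ],
    }],
  });
  return response.choices[0].message.content ?? "";
}
```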
Seems strange. Hm, I'm not liking this. Why don't we feed this in? Okay, let's pin this. We're going to go down the no-image route; this should now execute. Okay, so now we have the outline via the no-description route. And what's logically going to happen if I execute this Merge node? We should only have the output of one of these branches, because only one of them is going to be available. So now we can pin this, delete this, and feed it into generate new content. Now we should be able to execute the generate-new-content step, and it should write me something about Python, hopefully in my own tone of voice. Hm, interesting. Yeah, okay, that seems reasonable. The initial post, to me, is obviously very poorly written: "Mastering the Python interview: a strategic and analytical approach." The reason is that the source data itself kind of sucks. "Cracking a Python interview isn't just about knowing the syntax. It's about problem solving, writing efficient code, and handling real-world challenges. Here's how I structured my presentation: mastering the fundamentals, revised Python basics, data types, loops, functions," blah blah. Oh. Oh no, never mind: I'm just not feeding it the prompt. Yeah, that's what's going on; the outline isn't wired in. Good god, man, I was wondering what was going on. Okay, now we're feeding in the actual outline. Let's go back here and paste this in. I don't really like this, and I don't really like this either, yet, but again, the reason is I wasn't actually feeding anything in. "Cracked a Python interview lately? Notice it's rarely about syntax or code recall. The real test is in creative problem solving, efficiency, and how well you navigate ambiguity. Same skills driving sentiment analysis, or any data-driven field. Foundation matters. It's not enough to tick off lists, sets, control flow, and OOP. What counts is applying those tools. Take text data from social apps: parsing, modeling, extracting meaning. That's where Python's core shines. If you've built out a sentiment analysis pipeline, you know what I mean. Data structures in play, functions doing the heavy lifting, objects keeping state. Got to go deeper than the basics." Okay, so I'm just going to play around with this prompt until I find something that I like. Okay, so I've mucked around with the prompt a little. I added a tone-of-voice section that I got from a writer I follow online who also experiments with LLMs, which worked reasonably well; it talks about the level of formality being inverse to the topic's novelty, the classic style of Western writing. Big fan of Gwern, if you guys know who he is; give him a shout-out. Aside from that, though, the quality of the content is still all over the place, and the reason is I just haven't given it any examples. So what I'm going to do next is give it some examples of stuff that I've written in the past, paired with examples of how I would write an outline: they provide an outline, I create the post. Now, am I actually going to write all of those outlines by hand? I could if I treated it really seriously, but no. Instead, I'm going to go to my LinkedIn, find posts that I've actually written, pump those into AI, and say, "Hey, I want you to reverse this. Turn this into an outline." Okay, so let's see what I've got.
Can I find a post? Do I even have a post on LinkedIn? No, I actually don't know if I've ever made one. Let's see. Right, this is me back when I was doing stuff like this. Okay. Hm, you know, maybe instead of posts... oh, okay, this is cool: six years ago I wrote this. Wow. "In the modern world, money is a proxy for power. Power is freedom. By way of equivalence, then, the money you possess is directly correlated to how much freedom you have." Nice, I really like this. So I'm going to go to ChatGPT, open this up, use GPT-4.5, and say: turn this into a Markdown ATX outline; be detailed and comprehensive. Then I'll take that outline and use it to regenerate what I'm doing. Is this Markdown ATX format? It's kind of all over the place, to be honest. How do I turn this into Markdown? Maybe a rich-text-to-Markdown converter. Can we please turn this into Markdown, please? What do I do? How do I do this? Copy Markdown. Oh yeah, that's good. Cool. So now I'm going to go over here and add an example: the outline goes in here. Then I'll add an assistant message and put my original post in as the response. One more thing I've got to do: I think I have to replace with new lines; I don't know. Okay, and then here I'll put the outline. Now I'm actually feeding in the real outline down here, and the model has one example of my writing, so it should perform a little better, a little closer to what I want. And I can just remove this. Okay, so we're going to give that a go and see how it works. We're obviously going to consume more tokens for this, but I think the sweet spot is between two and five examples, so if this doesn't work, I'll just provide more. All right, let's see what the new content is like. The source content, over here, is about problem solving, and the output is: "Crushing a Python interview isn't just about spitting out syntax. It's a whole art form: creative problem solving, writing efficient code, handling real-world scenarios. Exactly the stuff that powers real analytics, like wrangling social media sentiment analysis. Analytical thinking and adaptability matter here just as much as in any data-driven field. Start with the basics, but really know them: data structures, control flow," blah blah. "Use them to build things." Okay, cool, nice. "Practice every day. Write code that's clean and sharp." Okay. So how are we doing? What are we thinking? Is this better, or is it lower quality? I mean, do I think the original post is high quality? No. I don't really think it adds anything super valuable, if I'm honest. Not to dump all over you, Manoli; I just do this stuff pretty often. So yeah, I think my post is better; I think it's laid out nicer. Personally, I find the formatting kind of tacky. I think that was an exclamation point; maybe it was a period, I don't know. Okay, so this is the output: 373. This was the intro. Looks like there were some tags as well; we can add tags if we want, but I don't think they're required. Looks like our output's longer, which just owes to the tone of voice I've supplied. You don't have to use the same tone of voice; obviously you could add emojis if you wanted to, for Christ's sake.
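The few-shot structure being assembled here is simple: the reverse-engineered outline goes in as a user message and the original post as the assistant's reply, so the model sees one worked instance of outline-to-post in your own voice before it gets the real outline. A sketch, with the rules paraphrased from the video and the example strings truncated to placeholders:

```typescript
import OpenAI from "openai";

// Placeholders for one real past post and the outline ChatGPT produced from it.
const EXAMPLE_OUTLINE = "# Money as a proxy for power\n## ...";
const EXAMPLE_POST = "In the modern world, money is a proxy for power...";

// Paraphrased from the video's prompt; the full rule list is longer.
const WRITER_SYSTEM = `You are a helpful, intelligent writing assistant that takes
an outline and uses it to write a high-quality LinkedIn post that adheres to
character limits. Rules: do not be overly engaging; be spartan and relatively
informal, but maintain a professional, curious tone.
Output JSON: {"post_body": "..."}`;

async function writePost(client: OpenAI, outline: string): Promise<string> {
  const response = await client.chat.completions.create({
    // gpt-4.1 at this point in the build; the video later switches to GPT-4.5
    // (API name "gpt-4.5-preview"), which is much pricier.
    model: "gpt-4.1",
    response_format: { type: "json_object" },
    messages: [
      { role: "system", content: WRITER_SYSTEM },
      // One few-shot pair: an outline of a real past post, then the post itself.
      { role: "user", content: `Outline:\n${EXAMPLE_OUTLINE}` },
      { role: "assistant", content: JSON.stringify({ post_body: EXAMPLE_POST }) },
      { role: "user", content: `Outline:\n${outline}` },
    ],
  });
  return JSON.parse(response.choices[0].message.content!).post_body as string;
}
```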
You could do it like she did over here. Yeah, the source-to-final products are reasonable. I'm not going to say this is the highest-quality thing, but obviously your definition of quality depends on the specific prompt style you're going for, and so on. I'm kind of curious: if I generated this with GPT-4.5, would I be happier with it? Let's try. Okay, now we're generating the new content with a different model. This model, I should note, is significantly more expensive, so do I actually recommend it in practice? No. I'm just throwing some stuff around, seeing which model might be best. Okay, we'll go down here. Yeah, you know, it actually works significantly better, I would say. I'm obviously being subjective here, but I just love the way it wrote this. It seems kind of poignant. There's no crazy leading question, which AI tends to do; it asks these rhetorical, annoying-ass things that we all know the answer to. And I think it got my tone of voice down better. So yeah, I'm probably just going to use GPT-4.5. Cool, cool. All right, sweet. So what do we do after we've generated the new content? Obviously we need to add it, right? The last thing is to append the data to our sheet. Let's go over here. Instead of the source posts, I'm going to write to destination posts. And if you think about it, I haven't actually added that data structure yet, so let's go to destination posts and copy most of the same columns: original post ID, original post URL, generated content, original LinkedIn URL, generated at, and then post status. I'm just pulling these out; I don't know for sure if these are the columns we'll go with at the end of the day, but they seem pretty good. Now we can map all of this data. Where am I actually getting the data from? The Loop Over Items node, but we can't reference it directly here, so I do have to remap all of this, which is kind of annoying. Loop Over Items, item.json.id: we're going to grab the ID from there. Sorry, from here. Then the original post URL, I think, was just called post URL; let's try that. The generated content, I believe, was just called content. The original LinkedIn URL was called LinkedIn URL. The generated-at is just going to be our own variable, which is now. Oops, dollar sign now, like that. Post status I'm just going to leave blank. I do have to do some unpinning here for the system to work, so I'm going to unpin this. Now that we have all this, I'm going to run the append, loop this back over to the Loop Over Items node, and click execute workflow, and we'll just see what happens. Workflow executed successfully. The reason it did that is because we don't actually have any data here, and because we pinned one of the nodes. So I'm just going to manually carry it all through: one more manual run of everything. Once it's done with this, we'll go here, then execute this; it should be easy and fast. Never mind, I just reran it again. That's annoying. Remember to pin your outputs, folks; otherwise you've got to do this over and over again like me.
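The mapping into the destination-posts sheet is just a field-by-field copy plus a timestamp. A sketch of the row shape; the column names are assumptions based on what's set up on screen, and `{{$now}}` in the n8n expression becomes a plain timestamp here:

```typescript
// Hypothetical destination-posts row; column names mirror what's built on screen.
type DestinationRow = {
  "Original Post ID": string;
  "Original Post URL": string;
  "Original LinkedIn URL": string;
  "Generated Content": string;
  "Generated At": string; // ISO timestamp, from {{$now}} in the node expression
  "Post Status": string;  // "" -> "draft" -> "published"
};

// Map one scraped item plus the model's output into a sheet row. The input
// field names are placeholders for whatever Loop Over Items actually carries.
function toDestinationRow(
  item: { id: string; postUrl: string; linkedinUrl: string },
  generatedContent: string,
): DestinationRow {
  return {
    "Original Post ID": item.id,
    "Original Post URL": item.postUrl,
    "Original LinkedIn URL": item.linkedinUrl,
    "Generated Content": generatedContent,
    "Generated At": new Date().toISOString(),
    "Post Status": "", // left blank here; the video later initializes these to "draft"
  };
}
```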
And believe it or not, rerunning over and over again is one of the biggest time sinks in development. I would have saved 15 to 20 seconds there; do that 50 times over the course of development and that's about 1,000 seconds, or roughly 17 minutes. Okay. So, pin generate new content here, pin the output here, and we should be able to generate. If not, we do have to run this end to end. We should do a big end-to-end test no matter what; I always recommend it, but generally I don't want to end-to-end test everything all the time, right? Okay, yeah, we can't get the Merge node execution here. Why? Because we're running into an item-matching issue. Yeah, okay, I think we do need to do an end-to-end test, which is really annoying; I hate doing these, but we'll keep these two pinned, and everything else should flow. Let me just add one more thing: if we don't add this, it could be a problem. The reason is I just don't want my workflow to start running without me. I'm going to loop this around, and we'll do, let's say, a five-second wait. Okay, let's now execute this from the get-go. We're going to find web data. What is going on with n8n down here, man? It says it's saved. Is it saved? Well, if I just lost it all, then I'm an idiot. Oh, looks good to me. Oh, I get it. Oh yeah, sorry, my bad: this whole time it's been running the top route, so I mistakenly thought I could get away with this. I can't. Okay, this is going to be the schedule trigger for now, and now I'm going to execute the workflow. Good god, man, what a day. All right, testing time. That first item did not have an image, so we went down the bottom route. What happens now? Because this field was empty, we merge and generate the new content over here; once we've generated the new content, we go through the Google Sheets node, which will obviously be the rate-limiting step in our flow. Looks like we ran into some issue: "Branch not found. Item zero in node Merge references a branch that doesn't exist." I don't understand what the hell we're talking about here. Content, outline. Hm, weird. Looks like we fed in content and outline, so perhaps you cannot do it this way; I'm not entirely sure. "Item zero in Merge references a branch that doesn't exist." So what is item zero? Huh. Okay, I guess because this branch is empty, we're getting data problems, because I'm splitting up the data. So, because of these item-matching issues, I think I'm just going to cut this off and split this into two routes. Is this the most efficient way to do it? Not really sure. I don't like having to duplicate my logic here, I really don't, but I care more about done than perfect at this point. So I'm curious: if I go down this route, then loop back to the Loop Over Items node like so, then select this and hit Shift+Option+T... whoa, what the hell just happened here? Oh yeah, that does actually look better. Would this now work? Right, so the no-image route down here generates new content with no description; up here we generate content with the description; and this adds to my sheet. Looks like we can't get data here because this is the test route, so we're going to have to run this again, I believe. Let's try the switch-route stuff again; we'll pin these.
And now I'm just going to run it from here, right? Because we have that one item, it should just go down the bottom route. Okay. I suppose we could add a merge at the very end: merge the inputs and then have one shared tail instead of two. That's probably simpler; I'd take it. And because it's at the end and not at the beginning, it shouldn't meaningfully impact our data. That way we get away without duplicating that last bit. We'll see. The question is, can we just append this to the sheet? Because that's what matters. Let's see. So, that looks good, and now we're waiting. Okay, cool, that's fine. Once we finish one full loop, we're going to stop this. Thank you. And now we can pin this. I'm going to keep these in, but I'm going to add a merge node that takes the output of this, and then this loops back around. This should now be less annoying, because we're not duplicating the data; typically you want to avoid duplicating nodes wherever possible. Then we merge, appending the two inputs, though only one of them is ever going to be present. So we append the two inputs, merge them, and run again. All right, I think this is good. Let me just try it. It should jump through the bottom. We're going to wait. Is this merge going to work? Okay, it looks good. Now it's running again and again just because I pinned it. What I want to do now is an end-to-end test of this; it should run on two or three pieces of data. Let's go up here, make this a little better looking, and then go back. Now we can pull the same data here. That's fine. Let's just... oh no, oops, sorry, one more problem: I pinned this again. I'm going to unpin that and stop this. Okay. All right. Uh oh. You know, I just realized we're waiting here, and we should be waiting after the merge. So unpin that, do the wait after the merge, and loop back. Okay, execute workflow. We know the first route has no image, so we go down the first route. The first route's looking good. Awesome. We're going to do the same thing now and see if we can find images to test the second route. Okay, looks like we don't have an image. We're getting a lot of rows with no images, so let me just check the main posts. Yeah, okay, I guess I kind of screwed up there; we didn't need to do that. I could have just moved this up here and we would have tested it. On the fourth row we should get an image, and there we can test the top route for completeness. But we are generating the content pretty nicely, which is cool. So, oh, hold on a second: we're feeding the wrong data in here. The generated-content column is currently getting the content that we scraped. Yeah, we don't want that. No. Okay. Anyway, we're going down the image route. Let's see what happens. Come on, mama needs a new pair of shoes. Let's go, chop chop. I am very scared. What's going to happen? Awesome, we got one item. Now we're just waiting for the other. And great, we verified that the route works. Now, obviously the one change we need to make is that we're not feeding in the right generated content; aside from that, everything looks good.
These are all... well, these two are the same, but this one here is the one with the image, so that's good. I'm just going to stop this, and then we need to change what we're mapping: what we're generating is the post body here, and up here we're generating basically the same thing, so post that in here. Awesome, cool. Now we have a scheduled parasite flow. I can get rid of this, and logically speaking, we exit out of that, add this back, and drag this up here. Oops, doesn't look like I got all of it; drag this back up here and rearrange it a bit for simplicity. We'll do the same with this. This will be my test trigger, which initializes the Google Sheet, and over here we schedule the parasite flow. Awesome. Cool. So now we have a system, and why don't I just run it in the background while I talk through the rest? Oh, actually, sorry, I do need to steal back the when-clicking-execute-workflow trigger. While I'm running this on these 99 posts, let's talk about the last part here. Okay: check hourly, daily, or weekly, and post to LinkedIn using page credentials. Posting on LinkedIn is a massive pain in the ass. You have to deal with their... it's not the Graph API, it's some other API format that they have, and it's not very easy. So what I'm going to do is go into a previous build where I've automatically posted to LinkedIn before, and take that. And then, if you think about it, the last thing we need is this: when all of this is said and done, we're going to have tons of posts here, say five or ten or fifteen. We need a way to check this, maybe once a day, grab anything whose post status isn't "posted," post it, and then change its post status to posted. Now that I'm thinking about it, we probably need to initialize the post statuses to something like "draft." Yeah, so that's what I'm going to do: I'll call these drafts, and that way we can get all the draft posts, sort them so the earliest generated-at values are at the top, and start publishing those. Once a day the flow runs, and the post status goes from draft to published. That's my idea, anyway; let's give it a try. My ideas are usually pretty good. Copy this, paste it over here, and we'll call it "scheduled LinkedIn poster." Let me add a note: this posts to LinkedIn; you'll have to update your credentials in the HTTP request module, and it updates the Google Sheet so that the published destination posts are set to published. Paste. Wonderful. Now let's move that back here. Once we have that, how do we actually do this? Well, guys, you'll notice it's very, very similar: we do the same thing over and over again. We read through the sheet, except instead of reading the source posts, we now read the destination posts. Hopefully you see why I've constructed it this way. I've made it very intentionally simple, because systems that aren't simple tend to be a lot trickier to deal with and maintain.
But now I just know that there are four flows, and each one does one thing: the first flow gets us the list of creators, and the next three proceed logically down the tables. So the first thing we do is check whether post status equals draft; I'm just going to pull all the rows that are drafts. I haven't updated the sheet to actually include draft statuses yet, but that's okay; I'll do that in a moment. Then I want to verify that I can actually get the draft rows. So let me stop this workflow and unplug this puppy, which is just my little tester. Now I'm going to verify: can we actually get things labeled draft, first and foremost? What do we get? We got four items that were draft. Very cool, right? And while I'm at it, let me update these so they actually say draft. Draft and draft. This is why I don't like having the same data twice, the same nodes twice. Okay, just saving here, because n8n was down, as I mentioned. Cool. So now that we have these four posts, what do we really want to do? Well, we only want to publish one, right? So we could use "get first matching row," and I think that'll actually work this time. Let's execute the step; we should only get one item now. Cool, we got one item. Nice. What do we do? We just do our LinkedIn posting logic, so let me get that. All right, the way you do this is with the LinkedIn "create a post" node. You first need to connect a credential; to do that, head to the top right-hand corner here and click create new credential. In order to post as a person, you need to turn off organization support and keep "legacy" on. It's weird, I don't know why, but you have to do this. If you want to post as an organization, keep both of those toggles on. Anyway, I connected my personal LinkedIn account. Resource: post. Operation: create. Post as: person, and I am the person. "Hello world. This is going to be me making a LinkedIn post." I'm going to set the visibility to connections, just so I don't totally butcher myself, given that I do have a few followers at this point. So I'll go back to my LinkedIn flow, set it to connections, execute the step, and cool, done. If I go back to my profile, you'll see under activity that I just made a post, and it says hello world. So I'm going to delete this now. What does that mean? All we do now is hook this up so that the content is pulled from our Google Sheet. Move this there, move that there. We need the data from the previous node, so let's execute a step and grab the first draft post here. Now let's pin this. I just want to feed in the generated content. I'm not really sure if the new lines will come through properly, so we're just going to test it; I'll post on my profile as many damn times as I have to. Looks like that worked. So let's refresh this, and, okay, that actually worked. Cool. So yeah, that's how you do it. Now we actually have a post, which is pretty sweet. Okay. All right. So what do we have to do to close the loop? Well, I'm going to pin this and go back to my Google Sheet. If you think about it, we need a way to update the row so that the post status changes from draft to published, right? How do you do that? You just use "update row in sheet."
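Under the hood, the LinkedIn node's "create post as person" with the legacy toggle on corresponds to LinkedIn's UGC Posts endpoint. A rough sketch of the equivalent raw call; the access token and person URN are placeholders, and the n8n credential normally handles the OAuth part for you:

```typescript
// Publish a text post as a person via LinkedIn's v2 UGC Posts API.
async function postToLinkedIn(
  accessToken: string, // placeholder: an OAuth token with the w_member_social scope
  personUrn: string,   // placeholder, e.g. "urn:li:person:xxxx"
  text: string,
): Promise<string | null> {
  const response = await fetch("https://api.linkedin.com/v2/ugcPosts", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${accessToken}`,
      "Content-Type": "application/json",
      "X-Restli-Protocol-Version": "2.0.0",
    },
    body: JSON.stringify({
      author: personUrn,
      lifecycleState: "PUBLISHED",
      specificContent: {
        "com.linkedin.ugc.ShareContent": {
          shareCommentary: { text },
          shareMediaCategory: "NONE",
        },
      },
      // "Connections only", as chosen in the video to avoid spamming followers.
      visibility: { "com.linkedin.ugc.MemberNetworkVisibility": "CONNECTIONS" },
    }),
  });
  if (!response.ok) throw new Error(`LinkedIn API error: ${response.status}`);
  return response.headers.get("x-restli-id"); // URN of the created post
}
```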
We're going to use the exact same document as before, LinkedIn parasite system, but now the sheet is destination posts. We could match on the row number, data, or location on sheet, but actually we can just match on the original post ID. Let's feed that in, and now it's going to map the right field for us. Then post status we just set to published. Okay, so now let's test this. Assuming we actually go through with publishing, this will find the specific record by ID and update it. Sorry, you need to set the original post ID; I'm a little crazy. Now it finds it and updates it, which means that, logically, the next time this runs it's going to grab a different post, and we do this every single time we post. All right, let's change this to four. Let's move this back over here, and let's duplicate this one more time. We'll set this one to run once a week, and this one here to run once a day, which is really cool. Connect that, and this will be set back over here. And that is our flow.
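Putting the daily poster's logic together: pick the oldest draft, publish it, then flip its status so tomorrow's run grabs the next one. A sketch reusing the hypothetical `DestinationRow` shape from earlier:

```typescript
// Select the earliest-generated row still marked "draft"; returns undefined
// once everything has been published.
function nextDraft(rows: DestinationRow[]): DestinationRow | undefined {
  return rows
    .filter((row) => row["Post Status"] === "draft")
    .sort((a, b) => a["Generated At"].localeCompare(b["Generated At"]))[0];
}

// After a successful post, the sheet row matched on "Original Post ID" gets its
// status flipped, so the next scheduled run picks a different draft.
function markPublished(row: DestinationRow): DestinationRow {
  return { ...row, "Post Status": "published" };
}
```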

### Outro [1:25:30]

There you have it: a complete LinkedIn parasite system that you can build for your own brand or sell to other people for 1,500 bucks a pop or more. Now, if you want to see exactly how to sell systems just like this to real clients, I'd encourage you to check out this video right over here, where I start an AI service completely from scratch and land my very first interested prospect live in under 10 hours. I walk through creating the offer, creating the positioning, adding the value, and generating qualified leads; you can think of it as a complete client acquisition process from start to finish. If you want this template for yourself, I provide it at no additional cost inside Maker School. For those of you who don't know, Maker School is my 90-day accountability roadmap that guarantees your first AI automation client or your money back. The whole idea behind Maker School is to eliminate decision fatigue, simplify the roadmap to client number one, and give you a set of steps that works. Whatever you choose to do, thank you very much for your time. I really appreciate you watching this. I'll catch you in the next video. Bye.
