# Tracking Crypto Vulnerabilities with A Killer AI Bot

## Metadata

- **Channel:** n8n
- **YouTube:** https://www.youtube.com/watch?v=a1PS7ZvFg9g
- **Date:** 21.02.2025
- **Duration:** 18:02
- **Views:** 8,002

## Description

What if you could track vulnerabilities in crypto projects before they get exploited? 

In this episode of The Studio, I sit down with Nikita (@NAndriievskyi). He built a crypto project vulnerability tracker using n8n, AI, and custom web scrapers. His client uses this data to send hyper-personalized cold outreach, helping convert those crypto projects into clients.

Chapters
00:00 - Intro
01:24 - Interview with Nikita
03:28 - Nikita explains the crypto tracker use case
04:16 - Use case walkthrough
16:50 - Wrap up



🔗 Resources & Links:
Sign up for n8n: https://n8n.io
Follow Max on LinkedIn:  https://www.linkedin.com/in/maxtkacz/
Join the n8n Community: https://community.n8n.io

#aiagents #aiagent #cryptocurrency #securitybreach #ai #automation

## Contents

### [0:00](https://www.youtube.com/watch?v=a1PS7ZvFg9g) Intro

Hey, it's Max, the original flowgrammer, and this is The Studio, the show where I share the stories of flowgrammers across n8n's global community. Cryptocurrencies by their nature exist on a blockchain, a publicly inspectable ledger, and inherently that means it's scrapable. The same is true for their websites, related GitHub projects, documents, and white papers. It just so happens that there's a great workflow automation tool that's good at scraping and transforming that kind of data, with AI functionality to process and aggregate it and do some useful stuff. That's why Nikita, a workflow automation consultant, used n8n to build a crypto project vulnerability tracker for one of his clients. The client uses insights from that report to create hyper-personalized cold outreach to those crypto projects for his own business. Given that with AI there's even more cold email and outbound, and a lot of people doing it badly, you can imagine that a hyper-personalized cold email citing your own crypto project and a real vulnerability in it is going to convert higher than your average AI personalization. However, as with most n8n flows, tweak the source app, tweak some logic, and you've got a completely different use case: for example, a Craigslist arbitrage bot. I'm doing this thing where I float use cases that I don't have time to build in hopes that someone does. Let's hear more about it from the man

### [1:24](https://www.youtube.com/watch?v=a1PS7ZvFg9g&t=84s) Interview with Nikita

himself.

Max: Nikita's one of these guys; again, I'm on LinkedIn and I see folks building with n8n, and I'm reading about the use case and thinking, wow, that's really interesting. Would you mind introducing yourself?

Nikita: Hi everyone, my name is Nikita, and I run an automation agency called Intelscale, where we create AI agents and automations for businesses to increase revenue, save time, or just help them scale. We do AI agents and automations because they're very powerful right now, and you should definitely use them.

Max: Absolutely, you're preaching to the choir here as a guy who works at n8n. Nikita, give everyone a little context: how long have you been doing this, and how many clients do you have?

Nikita: Essentially, I used to be a developer and a machine learning engineer, and I switched to building no-code automations on Make.com, and then switched to n8n; it just makes more sense, and it's way faster to build automations for businesses. We've helped over 15 or 20 businesses, anything from small businesses to eight-figure companies, helping them scale and increase revenue.

Max: Very cool. In 2025, what kinds of use cases are you seeing that are production-ready, that you're getting inbound requests for? What are the trends?

Nikita: Content is big; we've had a lot of requests to build full-on content production systems: anything from idea generation, to doing research on competitors and seeing what successful posts they make and getting ideas from that, to actually generating the content in the style of the client using fine-tuned models. Another big problem that businesses have in general is not doing follow-ups, or not doing them automatically; even eight-figure companies are not doing that enough. That's a big thing, because it pretty much increases the revenue of the business immediately once you automate it. And then anything client-related: doing research on clients automatically, because AI is very powerful and you can do a lot of research and get insights about the clients, so you're able to communicate with them better. Those are the main two; obviously there are many things beyond them.

Max: Thanks a lot for the insight, it makes a lot of sense. Fine-tuning is something I plan to explore a little in Q1, so I might be hitting you up with a few questions or a collaboration idea, because I think there's a lot of scope for small models: all of these customers have training data in their systems, it's just not in the right

### [3:28](https://www.youtube.com/watch?v=a1PS7ZvFg9g&t=208s) Nikita explains the crypto tracker use case

format.

Max: Without further ado, Nikita, could you give us a little context on what you're going to show us today?

Nikita: We had a client who does research on new crypto projects, finds vulnerabilities in them, and helps those teams solve the bottlenecks. One of the things he wanted was to scrape and find new crypto projects, do research on them, generate a summary of each project, and then generate a personalized message to the crypto project's team describing the vulnerabilities that could potentially happen. The main thing is the scraping, the research, and the analysis of new crypto projects.

Max: Makes a lot of sense. It's a cold outreach use case, but hyper-relevant: if I got an email in my inbox and someone cited a vulnerability, and I even recognized some of my codebase in it, that's something that would get my attention. Would you

### [4:16](https://www.youtube.com/watch?v=a1PS7ZvFg9g&t=256s) Use case walkthrough

mind sharing your screen and showing us this use case?

Nikita: There are multiple flows, and I'll show you the main ones that do the actual lifting. The first flow finds new crypto projects on one of the sources we use. There are essentially two sources; this is one of them, and there's a duplicate flow for the other source, literally the same but with a different scraper. So we trigger the flow (here we actually don't use a custom scraper, we scrape directly from the source) and we get the projects. If you look at the output, it looks something like this: we get the date of the project, its name, and some other information. Then we check against the projects we already have in Google Sheets, making sure there are no duplicates, removing any that exist, and merging everything together. We check whether the project is actually new or a duplicate, and if everything is okay, we add it to Google Sheets. This one little code block, if you're interested, is just for debugging purposes: when we only want to return the first item, or the second item, or the first five items, the block just says take the first five, or the last five.

Once the project is added to the Google Sheet, the fun part begins: we start to analyze it and get its social links. I prep an input for the next module, which just takes the website link, the source, and the name of the project. We're using a different tool with a custom scraper that finds the social links: the Twitter, the website, maybe a documentation link, that sort of stuff.

Max: Got you, so that's a separate workflow that gets that information and does the research with that specific context?

Nikita: Yes, exactly. It's a very simple workflow, because it's using a custom scraper that we wrote in Python and deployed; it just calls the link, retrieves the information, and returns it to this module. Then there's a little step that checks whether the data is missing or not, and after that an OpenAI step to restructure the social links and keep the ones we actually care about: the website link, the GitHub link, Twitter, and email if they have it. We just use AI to restructure the data and put it into Google Sheets again: we show it the prompt, ask it to create a JSON string as output, and then update Google Sheets with the information.

Then the interesting part begins. We prep the data for the next module (just the social links and the name), and we call the "research the project" module, a custom workflow that is actually a research agent we've built. If I go to this tab, you can see it has access to three tools, all of them custom scrapers we've built and deployed. There are essentially three steps. First, when the social links include a link to documentation: that's the main source of information you want from a crypto project, so the AI agent uses one of the tools to get all of the information from the documentation. This automatic scraper goes to the website, gets all of the links, scrapes all of them, then goes to each link and scrapes all of them again. It's very deep; it gets so much information.

Max: You built a recursive web scraper, basically?

Nikita: A custom recursive web scraper, yes, in Python. That's for the deep analysis. If there was no documentation link, the next thing the agent does is use another tool, a simple HTTP tool that retrieves all of the links from the website. Sometimes, as you see here, there are no links and it just returns nothing, but sometimes there are a hundred links and it needs to analyze all of them and see which ones might point to documentation. If the agent finds a documentation link, it uses the recursive scraper again to get all of them and store the content for later analysis. If documentation is not found, it uses another scraper, one without recursion, that works on a manual basis: we provide a list of links we want to scrape, it scrapes all of them and combines the data into one big chunk of text; the agent decides which links to use and passes them in. In the case where no links are found at all, like the one we see here, it uses the automatic recursive scraper on the main link of the website, going through all of the links and scraping all of the content there. Sometimes this HTTP tool that retrieves links just doesn't work, so the agent has multiple ways to route itself. As a last resort, if everything fails, there's no content, the website doesn't work, or they deleted the website, it uses the overview page from the source we're scraping and stores that overview information as well. That's the main workflow of this agent.

Max: Interesting. I've seen a few other users, when talking about prompting best practices, mention having an escape hatch for your AI agent, and I'm seeing that premise here: when things don't go right, which can happen when you're giving it autonomy, define what that looks like.

Nikita: And one tip, if you are writing prompts for AI agents: use AI to create the prompts as well. I just went to Claude, explained everything I wanted to do, and asked it to create a good, well-written prompt. It spits out a very detailed prompt that actually works, because what you'd write yourself is not necessarily what actually works for AI agents.

Max: Can I ask, in Claude, are you using the prompt-writing assistant tool, or just vanilla Claude?

Nikita: I'm just using vanilla Claude Pro.

Max: Free pro tip for everyone at home: do you use LLMs to help write your prompts?

Nikita: Yes, all the time. I use AI for pretty much everything right now.

Max: Guilty as charged as well.

Nikita: If you want to look at the outputs: we start the AI agent; it sees that there's no documentation link in the social links, so it uses the HTTP scraper to find all of the links on the website. It sees there's no content in the reply, so it takes the main link of the website, uses the recursive scraper, and finds all of the information. This is actually a good example, because as you see here it did not find any information; I think the website doesn't exist anymore, or there are some issues with it. In that case it used the manual scraper, which is the edge case, on the overview page of the current project. That shows all three of the steps that could go wrong; they did go wrong, but the agent still found information about the project.

Max: That's really good. Nice modeling of the possible outcomes and being prepared for them. I think that's key across these AI agents: if you're going to make something autonomous, you have to understand the domain it's going to operate in and account for those cases.

Nikita: After all of the information has been scraped, we use the "analyze the project" module, also a custom workflow, where we summarize it and create a personalized message. Let me show that as well. We get lots of content, just text, from scraping the website. What we want to do next is channel this text to the AI, but if you scrape websites you get thousands and thousands of characters, and you usually cannot feed that into AI because of the context limit. So we take the scraped content and chunk it up into smaller pieces; here I'm using Python to chunk it into 20,000-character chunks and return all of them. In this case there's only one item, because we scraped little information: the website didn't actually exist. Next I iterate over all of the chunks and generate a summary with AI on each one, asking it, very simply, to please summarize the given content about the crypto project. Then we aggregate all of the summaries (there might be 10 or 15 of them) and use a final ChatGPT module to generate a final summary of the project. That output is what we use to find the category and subcategory of the crypto project: we have a list of categories with their subcategories, and the AI spits out the category it thinks the project belongs to. We then use another custom tool to generate a personalized message to the crypto project's team. In Google Sheets we have a template for the personalized message, and, as I mentioned earlier, a list of vulnerabilities for each subcategory. We take the subcategory, map it in Google Sheets to find the vulnerabilities of that category, and feed that into AI to generate a personalized message. Essentially, it finds the subcategory, gets the personalized message template, and generates the personalization from all of that. Once everything is done, it updates Google Sheets with the project name and all the social links it found; we can see the classified category and the short summary, which is pretty big for a project that only had an overview page, which is pretty cool. Then it generates the personalization, which I can't show because it's for a client, but just trust me on that.

Max: No problem. What's really cool is that there's complexity here, of course, but we're 20 minutes into the recording and you were able to explain this at a high level, even showing some prompts and details, in that time. The relative simplicity compared to what's being done automatically is what I find super impressive. One question I had: I see you're using OpenAI nodes in the workflow. Was that a conscious decision, a client requirement? How did you decide which models to use?

Nikita: OpenAI is one of the cheapest options out there, especially GPT-4o mini; at the API level it's almost free when you're not using it at a big scale. That's why we chose it. Personally I think Claude is a little better than ChatGPT, especially in some creative tasks, but it's just way more expensive to run, and ChatGPT does the job.

Max: That makes a lot of sense, and it's a trend I'm seeing. I think I've used the metaphor before: yes, maybe Claude will work for this use case, but are you using a flamethrower to light a candle? A flamethrower will definitely light the candle; it's just that a lighter will probably also do it. What influenced your decision to use n8n to orchestrate this use case?

Nikita: I had heard about n8n before. I was using Make.com for all of my automations, but then I heard that you can self-host n8n, and I thought, hm, that's interesting. I looked at the plans and the capabilities and decided to switch, because as a developer I like to write some code, get some data, and be able to customize the workflows I'm building. I can't do that with most of the other automation software, Make.com, Zapier, or anything else, whereas in n8n I can drop in a Python code block and write some code to reformat my data as I want, I can build loops, and I can use AI agents, which are right there too, and add memory and use a bunch of different tools for building AI agents, which most of the other tools just don't offer. Plus the fact that you can self-host it and use it pretty much for free, only paying for the hosting service; that's very cool.

Max: So, firstly, guys: I do work for n8n, but Nikita does not, and he just said all that; we're not paying him, he's on this call for free, is that correct? All right, just for the record. It makes a ton of sense. One thing I reflect on with n8n is that sometimes there are a few different ways to do something, because it's flexible. When you were going through the example, you used a code node to limit the number of items coming out; we have a no-code Limit node where you just enter that through the UI. A lot of users might not know about it, and especially technical users are like, well, I'm more comfortable in code, which for us is fine, whichever way you like to work; but I thought it was an ingenious way to use a few lines of code to get some more functionality.

Nikita: I'm still new to n8n, and there are a lot of things I need to learn and different modules to discover. I wanted to write some code to remove duplicates from the data at the beginning of what I showed you, and then I saw that n8n already has a module that removes duplicates, which makes it super easy, and also the Merge node, so I don't have to write Python again. There's just so much customization and capability you can use with n8n.

Max: Very glad to hear that kind of feedback, as you can imagine. One really cool thing about using some of the generic chain nodes or agent nodes (I don't know if you saw the summarization chain and whatnot) is that you can swap out the model. What I do very often, if I'm building a use case and I don't know whether Gemini or this or that model is going to be better, is quickly hot-swap them and see which model is best for that step. Thanks so much for your time. You're building actively, and I'm guessing you're ramping up for 2025; if folks like what they're seeing and want to follow along with your journey, where can they do that?

Nikita: I've started posting a lot on X, so if you want some AI automation content you can follow me there; I usually post workflows I've built, workflows for clients, and AI automation advice. I have a YouTube channel as well; I'm not posting much right now, but I will be: explaining flows with n8n, building AI agents, personal AI agents that I'm actually building for myself right now, and more practical use cases.

Max: Very cool, I'm excited for that; join the club, it's very fun to put out videos on YouTube. As a big thank-you for your time, I'm going to send you one of these flowgrammer shirts; feel free to rep it on your show if you like, that's up to you. Nikita, again, I really love interviewing folks who are doing client work and solving real problems, because if you got paid for that work, that's validation that it's useful, right? Next time you've got a cool use case to share, let's hop on a call and share the awesome stuff you're working on with the people.

Nikita: Sounds good.

Max: Sweet. All right, have a beautiful rest of your

### [16:50](https://www.youtube.com/watch?v=a1PS7ZvFg9g&t=1010s) Wrap up

week. Pretty cool, right? What I loved about this use case is that it shows how to use AI together with not-AI. A lot of people are trying to slap AI on everything, and it doesn't always make sense. As you saw, Nikita combines multiple sources for his insights, sometimes with scraper-based technologies that have existed for years, and other times with AI, when it makes sense to. This is the first crypto use case I've shown on The Studio; if you liked it and want to see more, drop a comment and tell me what you'd like to see. I myself am thinking of doing a crypto vault: basically, put a bunch of cash in a crypto wallet, an AI agent owns the private key, and you have to convince the AI agent to give it to you, or jailbreak it, so to speak; then we document those jailbreaks publicly as we iterate on it and increment the pot each time. Definitely comment if you want me to do that one, because I chatted with my finance team about it and they did not like the idea. Pro tip to anyone trying to convince finance to use crypto: don't acknowledge that there's a small probability that the Taliban... In any case, I'm Max, catch you next time, and happy flowgramming!
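Nikita's recursive documentation scraper is custom Python that he deploys separately and doesn't show on screen. Purely as an illustration of the idea he describes (fetch a page, collect its links, then recurse into each link), here is a minimal standard-library sketch; the names `extract_links` and `crawl`, the depth limit, and the `seen` set are my assumptions, not his implementation:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen


class LinkExtractor(HTMLParser):
    """Collect every href found in an HTML document."""

    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Resolve relative links against the page URL.
                    self.links.append(urljoin(self.base_url, value))


def extract_links(html, base_url):
    """Return all hyperlinks on a page, as absolute URLs."""
    parser = LinkExtractor(base_url)
    parser.feed(html)
    return parser.links


def crawl(url, depth=2, seen=None):
    """Depth-limited recursive crawl: scrape a page, then every link on it.

    Returns a dict mapping each visited URL to its raw HTML. A real deployment
    would also need rate limiting, error handling, and same-domain filtering.
    """
    seen = seen if seen is not None else set()
    if depth == 0 or url in seen:
        return {}
    seen.add(url)
    html = urlopen(url).read().decode("utf-8", errors="replace")
    pages = {url: html}
    for link in extract_links(html, url):
        pages.update(crawl(link, depth - 1, seen))
    return pages
```

The `seen` set is what keeps the recursion from looping forever when pages link back to each other, which is the main hazard of the "scrape all links, then scrape each link's links again" approach described in the interview.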
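The summarization pipeline Nikita walks through (chunk scraped text into 20,000-character pieces so each fits the model's context limit, summarize each chunk, then summarize the summaries) can be sketched as below. The `summarize` callable is a stand-in for the OpenAI/ChatGPT module in his workflow, and the function names are assumptions for illustration:

```python
def chunk_text(text: str, size: int = 20_000) -> list[str]:
    """Split scraped page text into fixed-size chunks that fit a model's context window."""
    return [text[i:i + size] for i in range(0, len(text), size)]


def summarize_project(scraped_text: str, summarize, chunk_size: int = 20_000) -> str:
    """Map-reduce summarization: summarize each chunk, then summarize the summaries.

    `summarize` is a placeholder for the AI call; it takes a prompt string and
    returns the model's text.
    """
    chunks = chunk_text(scraped_text, chunk_size)
    partials = [
        summarize(f"Please summarize the given content about the crypto project:\n{chunk}")
        for chunk in chunks
    ]
    if len(partials) == 1:
        # Small pages (like the overview-only fallback case) need no second pass.
        return partials[0]
    return summarize(
        "Combine these partial summaries into one final project summary:\n"
        + "\n\n".join(partials)
    )
```

Splitting on a fixed character count is the simplest option and matches what was described; a production version might split on paragraph boundaries instead so chunks don't cut sentences in half.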

---
*Source: https://ekstraktznaniy.ru/video/15462*