‘Modernizing legacy IT systems with n8n, Agents & MCP integration’ - Berlin Meetup (May 2025)
Duration: 13:08


n8n · 10.09.2025 · 5,130 views · 97 likes


Video description
André Lindenberg, Fellow at Exxeta, focused on AI-driven Software-Engineering, previously Technology VP and Division Lead. Re-engineering legacy systems is tough—but with n8n, you can modernize faster using AI agents, smart prompts, and context-aware automation. This talk shows how, plus a sneak peek at how MCP integration supercharges workflows. #n8n #community #ai #agents #lowcode #nocode

Contents (3 segments)

Segment 1 (00:00 - 05:00)

So, hi everybody, my name is André. I'm an AI Fellow at Exxeta, which basically means I get to focus a lot on the application of AI in all sorts of use cases. We're really focusing on GenAI, because that's what all the conversation is about, but of course also AI more broadly. Let me say this quickly: at Exxeta we are 1,300 people in Germany. We do mostly software engineering, but also product building, helping clients with digital innovation, and all sorts of other things related to software development. That's also why generative AI and artificial intelligence are really crucial and important for us. We heard a lot of interesting things already regarding n8n and upcoming features, and we saw this adoption curve just a couple of minutes ago. That's pretty much when I started with n8n as well, maybe a little bit earlier, probably October or November last year. The reason we started looking at it was this: we had been implementing GenAI use cases for quite some time, two and a half years now, since it really started with ChatGPT. At a certain point a lot of people in our organization said, let's implement this and implement that, let's build something, because we can automate something, and I had to deal with a lot of requests along the lines of "can we put this into a Docker container and deploy it to a cloud environment?" At some point there were really a lot of requests and a lot of software that had been built, and we looked at it and said: oh man, there are so many deployable artifacts, maybe we need to find something different. That was one driver. The other one was that it took quite a lot of time to implement automation for use cases. So we looked at various frameworks, not just n8n but other ones too, compared the frameworks and libraries and everything that's out there, and today we're
here partnering with n8n, pretty much because we think it's the best. So let's dive right in. A couple of links I want to share. The first is a video I watched from an AI workshop, "Automate Everything in HubSpot," back in November last year. We saw this video from the community and thought: okay, cool, you can automate a lot of things in HubSpot with AI and n8n. He talks about a use case in HubSpot CRM, a customer relationship management tool for those of you who don't know, where you manage the touch points with your customers; we use it as well. The use case is summarizing the conversations with clients. That's not our use case, but it made me aware that we could use n8n for something similar. We then started automating the data quality assurance process in our HubSpot, because I'm constantly in discussion with our key account managers, asking them to please fill out all the fields we need so we know what's going on. That's an endless, very painful story, and we automated it with n8n. Another video we then watched is about automating pretty much everything, also very interesting, and it gave us the idea to automate things like license management in our organization: who's using what kind of license, and in particular who isn't, and why are we paying for those licenses? So, automating even more. And this is one of the most recent ones we saw on YouTube: building advanced retrieval-augmented generation systems with Supabase, using a mix of vector embeddings and structured data in a database to get better answers for RAG cases. This gave us the idea that maybe we can run a workflow in n8n for building code RAGs, because this is very relevant for us in software engineering.
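The hybrid retrieval idea mentioned here (vector similarity combined with structured filters, Supabase in the talk) can be sketched roughly as follows. The toy bag-of-words "embedding," the document table, and the `lang` filter are illustrative assumptions of this sketch, not the speaker's actual setup:

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy bag-of-words "embedding"; a real setup would use an embedding model.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# The "structured data" side: each chunk carries metadata you can filter on.
docs = [
    {"id": 1, "lang": "java", "text": "order service validates customer payments"},
    {"id": 2, "lang": "sql",  "text": "stored procedure updates customer balance"},
    {"id": 3, "lang": "java", "text": "logging utility writes audit records"},
]

def retrieve(query: str, lang=None, k: int = 2):
    # Hybrid retrieval: structured filter first, vector ranking second.
    candidates = [d for d in docs if lang is None or d["lang"] == lang]
    q = embed(query)
    ranked = sorted(candidates, key=lambda d: cosine(q, embed(d["text"])), reverse=True)
    return [d["id"] for d in ranked[:k]]

print(retrieve("customer payments", lang="java"))  # → [1, 3]
```

In a real code-RAG workflow the filter column might be the repository, module, or file type, so the LLM only sees chunks from the relevant part of the codebase.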
We have to onboard new developers to our systems and help them understand the codebase even faster. What I want to say with all these use cases, before I dive into the use case I brought with me today, is that the really cool thing is the community. I want to use the chance today to say thank you for all the posts and everything that's out there, because it really helps us come up with the right ideas and check out the content. This is not everything you find on YouTube; these are just a few videos. There's a lot out there, shared templates and so on. So, what I actually do a lot with my team is IT modernization. When I say IT modernization, I mean you have a very old legacy IT application. For the older ones among you: implemented in COBOL on IBM AS/400, really old transactional systems running in financial services institutions and insurance companies. Those systems are really business critical, and a very hard thing to do is to discuss with the owners of those systems about starting to modernize them. Modernizing basically means finding ways to reimplement these systems in a more modern technology. The usual objections are "let's just wait two more years, because then another guy will be here and will have to handle it," or "it's good enough," or "don't touch it, it just

Segment 2 (05:00 - 10:00)

works," and so on. We see that a lot, but I work in that field and I really enjoy it. The way we analyze these systems, even before using n8n, is a three-step approach called decompose, transform and synthesize. This is what you need to do to tackle such systems. You need to decompose the systems and understand where the complexity is and how they are built. Then you need to transform them, which basically means an engineer sitting down and understanding: what can we do, what do we need to do, what does this piece of software actually do, and what would it look like in a new technology? And then you need to synthesize, which is pretty much putting it back together and making it work as a system. You cannot just run a for loop and translate all the source code, because then you have lots of source code but not an application. So you need to synthesize and bring everything together. Decompose and transform are my favorite parts, actually. Transform is not just about rewriting it: with old technology you have this procedural language, starting at the top of a file and ending at the bottom, then the next file, with dependencies in between. You need to move this to object orientation, more abstraction, Java frameworks like Spring, and so on. So it is really complicated. This is a workflow I brought with me that we use, for example, for translating stored procedures into Java. Stored procedures are database logic, and we have a lot of clients that have tons of business logic implemented in stored procedures, really a lot. Finding ways to semi-automatically translate such stored procedures into more modern Java code is really interesting. What this workflow does: the input is a file with more than 1,000 stored procedures, and then we have a little bit of code.
We split the stored procedures into various parts, then go over each one and translate it. The magic is in the LLM and the prompts: it's about extracting entities, controllers, services, and all of those things. It's exactly as we heard from David: putting it into a structure, having an LLM in one place and a little bit of code in another. This is really what helps us do these kinds of things. It's nothing I could release to the community as-is, something you could just use to translate every stored procedure, because it needs to be tailored to the specific case, but it can really help a lot. What we also did recently was evaluating whether we can use such workflows for Java software migration, Java updates from Java 11 to Java 21. This is something you frequently have to do, and it typically causes a lot of work: developers are very busy doing it and cannot implement other features, so we are automating these kinds of things as well. This is where the software helps us. Another workflow I'd like to show is one we use for generating code; in decompose-transform-synthesize terms it would be transform, and we use it for code generation. A bigger workflow, I would say. Its input is what comes out of the decomposition phase. In the decomposition phase (we have custom software for this) we build knowledge graphs and information graphs about an IT system, and this workflow gets such a knowledge graph as input and processes it. It comes up with some kind of plan to transform the system, then we have a transformation step, and what we get after it is a kind of procedural Java, I would say. Once this is done, we do some automated refactoring.
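The split-and-translate loop described above can be sketched roughly like this. The naive `CREATE PROCEDURE` split and the `translate_with_llm` stub are illustrative assumptions; in the real n8n workflow an LLM node with tailored prompts does the translation:

```python
import re

def split_procedures(sql_dump: str) -> list:
    # Naive split on CREATE PROCEDURE boundaries; real dumps need a proper parser.
    parts = re.split(r"(?=CREATE\s+PROCEDURE)", sql_dump, flags=re.IGNORECASE)
    return [p.strip() for p in parts if p.strip()]

def translate_with_llm(procedure_sql: str) -> str:
    # Stand-in for the LLM call; the actual prompt would ask for entities,
    # controllers, and services, as described in the talk.
    name = re.search(r"CREATE\s+PROCEDURE\s+(\w+)", procedure_sql, re.IGNORECASE)
    return "// TODO: Java translation of " + (name.group(1) if name else "unknown")

def run_pipeline(sql_dump: str) -> list:
    # One translation per stored procedure: exactly the loop in the workflow.
    return [translate_with_llm(p) for p in split_procedures(sql_dump)]

dump = """CREATE PROCEDURE update_balance AS BEGIN ... END;
CREATE PROCEDURE audit_log AS BEGIN ... END;"""
print(run_pipeline(dump))
# → ['// TODO: Java translation of update_balance', '// TODO: Java translation of audit_log']
```

The point of the structure is the one the speaker makes: code where code is reliable (splitting, looping, collecting results), an LLM only for the step that genuinely needs it.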
When we have the procedural Java that comes out of that step, we go to the next step in the workflow up there and implement a refinement. This basically means we refactor procedural Java into a more modular version. The reason is that I try to help our engineering teams that build our software implement those tasks quicker and faster and increase their velocity, and when I provide them with code straight out of the previous phase, they just reject it and say they cannot really use it. So we have some refactoring and improvement of the code quality, and we can automate a lot of things there. As you can imagine, we also include custom-trained models: locally running coder models that we train on specific data taken from the system, and with that additional context the results get better. What I haven't talked about yet is quality assurance. Quality assurance is a big point, and we have several mechanisms for it. I don't have it in the workflow here; we implement it manually. While David was talking about evaluation, I was already thinking about how to integrate it next Monday; I think it comes next Monday, and we will integrate it. Why is it important? Because if you cannot decide whether the Java you're looking at is correct or not, complete or not, whether something is missing or not, then you lose all the benefits of the up-front automation: you have to go back to the original source code and understand yourself whether the result is correct. That's why quality assurance is so important and crucial, and there are strategies for doing it. Our goal

Segment 3 (10:00 - 13:00)

is not to replace the engineers. We want to help the engineers be 50% faster. Let's say 30% faster would also be good, and in my opinion 60% would be even better. That's why we do the automation. If you listen to what the CEOs of Silicon Valley companies are currently saying, they're building agents and claiming we probably won't have to code in six to eight months anymore. I don't think that's true, at least not for the kind of complexity we're dealing with here. Regarding MCP servers: no meetup would be complete without talking about MCP servers and A2A. We use that as well; I used it even before the MCP nodes were released. When was that, two months ago, a month ago? Not so long ago, actually. How did I do it? I implemented a FastMCP server on my machine and connected it via code nodes from n8n. That worked quite okay, but now we have the MCP server triggers integrated, and I'm happy they're there. And I brought a couple of things we use with me. What I really like, though it's nothing for production, is the CLI node, because it's cool: you can talk to the agent and ask it all sorts of things. The CLI can access my file system, and I can do all sorts of things, because the large language model is able to map what I want to do to a specific bash command, for example, and then execute it. So I use that quite a bit locally on my machine for work. What I also like is things like browser use, because then I don't have to think about every single thing the user might want to do; I can enable an agent to do whatever it needs to do.
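The CLI-node idea (a model mapping a natural-language intent to a shell command, then executing it) can be sketched with a guarded dispatcher. The intent-to-command table is a hypothetical stand-in for the LLM, and the allowlist is a safety choice of this sketch, not something described in the talk:

```python
import shlex
import subprocess

# Stand-in for the LLM: map a natural-language intent to a shell command.
# In the talk, a language model does this mapping; here it is a fixed table.
INTENT_TO_COMMAND = {
    "list files": "ls -la",
    "show date": "date",
    "say hello": "echo hello",
}

ALLOWED_BINARIES = {"ls", "date", "echo"}  # guardrail added by this sketch

def run_intent(intent: str) -> str:
    command = INTENT_TO_COMMAND.get(intent)
    if command is None:
        raise ValueError("no command known for intent: " + repr(intent))
    argv = shlex.split(command)
    if argv[0] not in ALLOWED_BINARIES:
        raise PermissionError("binary not allowlisted: " + argv[0])
    # Execute without a shell so the allowlist cannot be bypassed
    # via shell metacharacters embedded in the command string.
    result = subprocess.run(argv, capture_output=True, text=True, check=True)
    return result.stdout.strip()

print(run_intent("say hello"))  # → hello
```

Whatever produces the command, the execution side deserves exactly this kind of guardrail; the speaker himself flags the CLI node as "nothing for production."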
Also important is the Supabase integration. I haven't talked about memory so far. We have the simple storage memory, and that's good for our use cases within a session: while something is going on, we need to keep track of what the agent is doing, and I need to have some information when I translate code, for example. What's even more important for our use cases is long-term memory, meaning persistence, and that's why we use something like the Supabase integration with databases behind it. When I come back tomorrow, I need to know what I translated yesterday in order to make the right decision when it comes to adding features to the application I automatically translate. So long-term and short-term memory are really crucial for good solutions.

Closing thoughts. Yes, we could implement a lot of this in Python, but we're a lot faster using n8n to implement the workflows, and I'd have the problem I talked about with the hundreds of artifacts: when we implement everything in Python, it simply becomes a burden to manage. What's also very important, as David said, is user experience, and I must agree. As I said, with these very business-critical IT systems, when you present this kind of logic as "this is how we do it," people can look at it and understand step by step what's happening, and you can implement human-in-the-loop features and all of that. It really helps with the discussion of whether we do this or not. So, that's my last slide. Thank you very much.
