# Community Hangout, Friday, June 28: United Nations OCHA & The Brain Tumour Charity

## Metadata

- **Channel:** n8n
- **YouTube:** https://www.youtube.com/watch?v=W4-tqE6sgIQ
- **Date:** 01.07.2024
- **Duration:** 1:01:41
- **Views:** 537
- **Source:** https://ekstraktznaniy.ru/video/15637

## Description

This month we were delighted to have the United Nations Office for the Coordination of Humanitarian Affairs (OCHA) and The Brain Tumour Charity as speakers. OCHA will cover their Humanitarian Data Exchange (HDX) program and how it's powered by n8n, and The Brain Tumour Charity will show how n8n now lets them handle larger events with less manual work.

As always, we started off with community and product updates and wrapped up with a round of questions and answers.

Chapters:

0:00 Welcome
2:37 Agenda
3:36 Community Updates
8:10 Product Updates
14:58 Jobs
16:46 Session 1: The Brain Tumour Charity - Automating for Impact
31:35 Session 2: Automating the Data Quality Assurance Process for UN OCHA's Humanitarian Data Exchange (HDX) 
53:10 Questions & Answers

## Transcript

### Welcome [0:00]

Hey everyone, welcome to the n8n community hangout. I'm your host today, Max; I'm taking over for Bart, so bear with me, as this is my first session. We've got some fantastic guests speaking today, and I'll get into that in the agenda in a little bit. Please keep in mind that we are going to be recording the session, so feel free to keep your camera turned off if you'd like to stay anonymous; it's no problem, we're just happy to have you here. And please make sure to mute your microphone. We will have Q&A throughout these sessions, so please post your questions in the chat and we'll make sure those get answered; this helps us run a smoother session. As you're joining, we've got an icebreaker question: what's an n8n feature that you recently discovered? Feel free to share in the chat, we're always curious to hear, or perhaps something interesting you've built with n8n recently. It could be fun, useful, hopefully both; we're always interested in learning what you're building with n8n. As I just mentioned, we'll be doing Q&A in between the sessions, so feel free to send your questions in during a segment, and after the segment we'll have some time for Q&A. If you have some miscellaneous questions, perhaps not related to the talk, feel free to post them; we'll collect those for the end of the session and answer as many of them as we can with the time. So we'll just wait another minute here, I see some people joining from the queue. As we're waiting to kick off, maybe another 60 seconds, again: what's a feature in n8n that you've recently discovered that you found useful? It's always useful for us to know. Let's check the lobby... well, we've got 24 of us joined already, so I think we can kick off, or at least let you know what the agenda for today is going to be. We have two guest speakers today, with some very interesting use cases
from our guest speakers. One is from the United Nations, and they're going to be presenting some of the interesting things they're using n8n for; I won't tease too much now, for suspense. And the same goes for The Brain Tumour Charity. It's always nice to see n8n doing some good as well, or empowering those who are doing good. So let's go to the next slide for the

### Agenda [2:37]

more formal agenda. I'm going to kick it off with some community updates; we've got a few different programs and things happening that we'd like to inform you about. Then Julio from our team will follow with some updates from the product; we've got some exciting new things happening in n8n, and he's going to detail those. Then we have our two talks, from The Brain Tumour Charity and from the UN's Humanitarian Data Exchange program, and a community chat at the end. Just a reminder to everyone to please post all your questions in the chat. That could be a general question, which we'll group for the end, or a question about the current speaker; we'll have a little Q&A segment after each talk is over. Okay, so we'll get started with

### Community Updates [3:36]

the community updates. The first update is about our ambassador program, which my colleague Bart has been working on launching. We're currently still onboarding new ambassadors: after you apply, we'll invite selected candidates for a video interview, because we want to make sure there's a good fit. Right now we're at seven ambassadors, so congratulations to all of the ambassadors who joined; you're our first ambassadors and we're really excited. You can get to know them a little better on our new Meet the Ambassadors page, which is linked here. Lis has already hosted two Zoom hangouts for our Brazilian community, so congrats, Lis, thanks for getting that started, and hopefully by the next update we'll be able to tell you about all the other exciting events and initiatives that our ambassadors are running around the world. Thank you very much to our first seven, and here's to many more; feel free to apply through the link here. For the ambassador program we've also got upcoming events. There's one in Amsterdam near the end of July (it looks like there's no fixed date for that yet), and our community events page is the hub for all events, so you can get updates on those and on future ones, including the ones from our ambassadors. Also, in São Paulo in Brazil we'll have events on July 8th and 22nd; again, through this link you can get more details. And yes, we're taking applications for the ambassador program, so if you think you're a good fit, if you have a passion for n8n and you'd like to nurture your local community, then apply; we'd love to hear from you. The next update from the community is that we're now sending a monthly newsletter, so thanks again to Bart and the team for putting that together. It's got the latest n8n developments and updates from our community, as well as featured workflows, most of which are workflow templates that you can set up yourself and then customize. To start receiving it, sign up via the newsletter link; the next one should be sent out next week. It's just another great touchpoint to stay on top of the many things that we and our global community are doing. The next exciting community update is that we have launched a tutorial hub on our community forum. You can go over to community.n8n.io, where there should be a link in the primary left menu, and there we're aggregating tutorials in various languages from our n8n community. These are not from our official n8n YouTube; these are all fantastic videos (setup videos, tutorials, and how-tos) that we're getting from across the global community. We wanted to highlight this because a lot of time and effort goes into this sort of content. So you can go over to community.n8n.io
to check that out. Keep in mind that you can select by language; we've just started this, so obviously it's going to grow, and you can also add your own tutorial. If you're creating content around n8n that you think is relevant, in whatever language it might be, you can also submit it to this new hub. We've also added two comprehensive free training courses: a two-hour basic course and a one-and-a-half-hour advanced course, which you can find on our YouTube; they should also be linked in these slides after the session. Those are some very interesting courses; even if you've already been using n8n for a while, there are some really useful tips in there. Let's see if there are any questions for the community updates. Looks like we've got none for now; if a question comes up, just feel free to ask it in the chat and we can get to it. And so now it's time for some product

### Product Updates [8:10]

updates from my esteemed colleague Julio. He's a senior product designer on our nodes team here at n8n; I've personally worked very closely with Julio and I'm very excited to hear from him about the interesting things we've been working on in the last month. Julio, are you ready to present? Yeah, ready, can you hear me well? Yes, I can hear you, I can see you; are you going to be sharing a screen as well? No, we can go through the slides. Okay, fantastic, please take it away. Yeah, it's just a few slides, so it's going to be quick. Nice to be here and meet you all. As said, I'm a designer on the nodes team, and today I want to show you a few things that we've been working on lately or just released in the last few weeks. It's not everything we added to n8n, but some of the things I like the most, so it's a slightly biased list. We can move to the next slide, starting from the first one: we improved the user experience of the input panel. We observed in our usability testing and in talks with our users that some users have problems using the data from all the previous nodes in their workflows, so we asked ourselves: how can we help users discover the whole chain of data that is available there? We made two main improvements for the JSON and table input panel, as you can see there: we moved the input dropdown further inside the panel, and we added the icons of the previous nodes, so it's more evident that this is the list of the previous nodes in your workflow. For the schema view we made a more important update, because the schema view now shows not only the list of fields available in the previous node but the fields of all the previous
nodes in the chain, so you have much higher visibility of all the data available in that node, coming from all the previous nodes. Hopefully this will really help you and our users access all the available data in n8n more easily; if you're using it already and you're happy with it, hopefully you'll be happy with this new stuff too. Next slide: autocomplete. We've been working a lot on the autocomplete panel lately, and you've probably already noticed some of it. We improved the overall UI, we introduced examples, and we created more of an IDE experience, if you will, closer to what you see in other software. In the last couple of weeks we added more methods, descriptions, and examples, so we keep improving the documentation you can access in the autocomplete panel. The autocomplete is now also available when you hover with the mouse over a method, and when your cursor is between the parentheses, so it's not just available while you are typing with the full autocomplete window open. If you want to check the documentation of any method, you can just hover over it with your mouse and the autocomplete will appear, showing all the information about that method. Next one: this was also in high demand from our user base, and we finally introduced the Slack trigger node. It's available already, it supports a bunch of events, and most importantly it supports the event of when a message is sent in Slack. It's there, so just check it out, check all the available events, and let us know whether it's good or not. Next slide, and last but not least: we also spent some time on dark mode, which we see a lot of our users using. When we released dark mode we saw a lot of users switching to it, and we see a lot of dark-mode screenshots in our community, so we said, okay, we should really spend more time on it. And what we
did is we really made the dark mode darker. We made the secondary buttons darker: before, they had a white background, so they drew too much attention away from the background; now they have a dark background with a white border, so they merge much better with the overall user interface. We also gave the nodes a dark background; again, the nodes previously had a white background and stood out too much from the interface, and now they merge much better with it. We also worked on some icons: we now have icons for dark mode, and we improved some other icons across the whole application. More work will come on dark mode; it's ongoing, and we want to keep improving it over the next weeks as well. That's it for the main product things from the last few weeks; if there are any questions, I'm here. Well, first off, I just want to say thank you for presenting these fantastic updates; I know our users are going to absolutely appreciate all of them, especially, I think, the dark mode. I personally was really waiting for those icons to swap, and a lot had to happen under the hood on the n8n team to swap out all those icons and make them dark-mode ready, so thank you for all that work. We're going to see if there are any questions for Julio. Looks like there aren't any at the moment; if one pops in, feel free to post it in the chat and we can get to it later. And for this next

### Jobs [14:58]

section, we're going to talk a little bit about some of the career opportunities at n8n right now, because we're growing. It's actually very exciting; I recently had one of my first moments at the company where there was a new joiner and I wasn't exactly sure what their name was. It's kind of an exciting moment when you're growing and getting a lot more bright minds to make this beautiful workflow automation product. On that note, there's a bunch of new roles that we're hiring for across many different departments, from engineering to product and design, etc. All of our jobs are remote as long as you're in the EU zone, and you can apply over at n8n.io/careers. There you can also get details on the various perks we have. Me personally, I'm a big fan of our open source contribution perk: each employee gets a monthly allowance that they get to decide which open source projects to support with, so it's personally very meaningful to me, and I think to a lot of us here at n8n. It's a fantastic place to work; I've been here for four years, and I highly recommend taking a peek if you think you'd like to take on a new challenge building a lovable workflow automation tool. If there are any questions around our careers, again, ask them in the chat and we can get to them, but I'd love to get to our presentations so we can give our speakers plenty of time to share their exciting stories and also leave lots of room for questions. So for our first presentation today, we've got the folks over at The Brain Tumour Charity, a UK-based charity

### Session 1: The Brain Tumour Charity - Automating for Impact [16:46]

dedicated to funding research, raising awareness of brain tumours, reducing diagnosis times, and providing support and information to people with brain tumours, their families, and their friends. Today Ibrahim, head of IT and CRM, will be talking about how n8n enables them to make an impact in their very important work. Ibrahim, can you hear me, and are you ready to begin? Good afternoon, Max, I can hear you; are you happy for me to start sharing my screen? Yes, please, thank you, and once the screen is shared, Ibrahim, take us away and let us know what you've been doing with n8n. Brilliant. Firstly, thank you for this opportunity to present and talk here, I really appreciate that, and also for the chance to talk about the charity; we love talking about the charity. My name is Ibrahim, and I'm the head of IT and CRM at The Brain Tumour Charity. We're a UK-based organization, and let me show you some of the things that we do. So who are we? We're The Brain Tumour Charity, the biggest brain tumour charity in the world. We focus on research, and we campaign for better diagnosis and better care for people with brain tumours, but we also provide support within the charity as well. Our values are being community first (everything that we do should be community first), collaborative, bold, and innovative. The reason I highlight that is that n8n fits really well into what we do and into what we've been able to deliver with some of the projects we have recently launched with it. So why did we choose n8n? Around the time of the pandemic we unfortunately had to downsize, and what that meant was that the integration-related projects for our CRM system, Salesforce, were achieved, if you will, by any means necessary, within the skill sets that we held at that particular time and the tools we had available. That meant there was no documentation, there wasn't any point of reference, and people didn't have confidence in the
integration-related projects or any automation that we had in place. When we looked at n8n (I looked at it personally), it was a source-available solution, it has a community, training was available, and the documentation was there. I ran it personally at home with Home Assistant. One of the challenges we had was that integration-related projects were stacking up: we were a small-sized team, still building the team up, and n8n enabled us to build a proof of concept relatively easily. On top of that, there are quite a lot of enterprise features, like versioning; you can add multiple users; you can now bucket workflows together into projects; and we're making use of two-factor authentication. There are also additional features, like single sign-on, that we want to look into, and log streaming, which we want to play around with a little more to see what it would mean for us. Now, to deep-dive into the projects: the Twilight Walk is The Brain Tumour Charity's flagship event. We do it once a year; it's a 10-kilometre (or is it 10-mile?) walk around London, as highlighted on this map. We did this in 2024 with the use of n8n, we raised over £400,000, and we raised a great amount of awareness with it. Here's how we did the Twilight Walk previously: we had always sold the tickets via Eventbrite, where our community had the opportunity to purchase tickets and also donate money to the charity, but we dealt with that data by exporting it from Eventbrite and processing those tickets manually into Salesforce, our CRM system. As calculated by our wonderful events team, it took them eight minutes per ticket, so about 160 hours to process all those tickets. There was great value to be had here; there was an opportunity, and the business case was just shouting at us. We wanted to be
able to automate this as swiftly and as effectively as possible, but also to give the events team the opportunity to focus on the event so that they could provide better support for the community. With n8n, we were able to use webhooks within Eventbrite to fire the data off to n8n, where we received the information, transformed it, and mapped it to send it across to Salesforce; the nodes were already available. Once the data was in Salesforce, we were able to look at how we could support the community: we were able to bring people onto Marketing Cloud, so email journeys, and everything else that the charity does in its operations. This was a great success for us, so much so that we are currently speaking with the events team about standardizing the integration from Eventbrite, so that any events we run in the future have a template with a default way of integrating into n8n, and everything comes into Salesforce as a unique campaign or unique event going forward. To touch back on the Twilight Walk: it was a highly collaborative project. It involved the IT team, the Salesforce team, and the events team; we also got to work with the Marketing Cloud team and a number of other groups and individuals within the charity to make this a reality. It was a quick and very effective turnaround, and it let people see how successful integration could be done. That was the initial and first project we used n8n for. The second one is what we see over here: The Brain Tumour Charity proposed that we should have a national brain tumour strategy, and for that we ran a petition, under the hashtag that it's a no-brainer, to improve brain tumour care across the UK. We were able to run the petition, and a lot of people signed up.
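The receive-transform-map flow just described can be sketched as a plain function, similar to what a transformation step in such a workflow might do. This is an illustrative sketch only: the payload shape loosely imitates an Eventbrite-style attendee webhook, and the Salesforce field names (`Campaign__c`, `Ticket_Id__c`, `Amount__c`) are invented placeholders, not the charity's actual schema.

```python
# Hypothetical sketch of the Eventbrite -> Salesforce mapping step.
# All field names are illustrative assumptions, not the real schema.

def map_ticket_to_salesforce(event: dict) -> dict:
    """Flatten an Eventbrite-style attendee payload into a Salesforce-ready record."""
    profile = event.get("profile", {})
    gross = event.get("costs", {}).get("gross", {})
    return {
        "FirstName": profile.get("first_name", "").strip(),
        "LastName": profile.get("last_name", "").strip() or "Unknown",
        "Email": profile.get("email", "").strip().lower(),
        "Campaign__c": "Twilight Walk 2024",   # assumed custom field
        "Ticket_Id__c": event.get("id"),       # assumed custom field
        "Amount__c": float(gross.get("value", 0)) / 100,  # pence -> pounds
    }

payload = {
    "id": "T-1001",
    "profile": {"first_name": "Ada", "last_name": "Lovelace",
                "email": "Ada@Example.org"},
    "costs": {"gross": {"value": 2500}},  # value in pence
}
record = map_ticket_to_salesforce(payload)
```

In a workflow of this shape, a webhook node would feed each incoming payload through a mapping like this before a Salesforce node creates or updates the record.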
And the reason for that is that within the UK, 34 people are diagnosed with brain tumours every day. We wanted to raise awareness and to provide better support for these people. The way to do that: our wonderful policy team within the charity came up with this petition approach, gathering signatures alongside an open letter to hand over to the relevant ministers in Parliament, the people who look after healthcare in the UK, and say: look, there is a gap in brain tumour care, and this is what you need to do. So let's talk about the infrastructure. Impact Stack was the tool that we used; it's embedded within our website. The community had the opportunity to go to our website, fill in their details, and say yes, we support the national brain tumour strategy. We had the signatures and contact details, and again the data fired across to n8n via webhook; we received it, transformed it, and mapped it across to Salesforce. When we received the data we were able to communicate back with our community immediately. We didn't have to download the data, wait a week for the CSV, tidy it up, and then manually load it into Salesforce; whenever people signed up on Impact Stack on our website, we were able to push that data to Salesforce with the help of n8n and take meaningful actions. Now, our targets weren't very high for this; we thought about 10,000 signatures over the course of a month would be great. We launched on a Saturday morning; I was the first person to sign it, because I enabled it, and then over the weekend we had about 20,000 signatures. n8n's backend infrastructure dealt with this on an Amazon instance, and it worked really well. Over the course of about two to three weeks we were able to get 52,000 signatures supporting a national brain tumour strategy. This has allowed us to make an impact, and as a UK-based
charity with a global focus, it has enabled us to highlight some of the concerns that we have about brain tumour care. It was a wonderful opportunity to have such a tool: we were able to communicate better as a charity and support our community better, and the marketing teams were able to get involved so that we could use social media to gather these signatures. There were quite a few different technologies at play here, but n8n was the key tool in the middle that orchestrated the transmission, the transformation, and the loading of that data between tools. It was a great success. So what's next? As a growing IT, Salesforce, and technical team, we now have more confidence in delivering automation-based projects, and thanks to the success of these projects we are collaborating with other departments to deliver further ones, all based on our impact: we are a community-first charity, and everything that we do should benefit our community. Since these two projects, we've used n8n further for Acuity, a scheduling tool that powers a benefits clinic system helping people impacted by brain tumours speak to professionals about the benefits they are entitled to claim. We also use a third-party volunteer management tool; we pull data from it into n8n via API calls and push it back and forth to Salesforce, along with marketing preferences and everything else. The same goes for our recruitment; n8n is pretty much touching everything we have currently. n8n sits in the middle of our recruitment tool: when someone applies for a job at The Brain Tumour Charity and they are successful, we are able to alert our talent department, the people and culture department (that is, human resources), to let them know that the candidate will be continuing with the
job, and it just makes things so much smoother. We're a small-sized charity aiming to do big things, and n8n has been key in delivering on that bigger impact. We are really happy with how things are going with n8n, so much so that we are getting interest from other digital teams, people who work with websites or who are dealing with projects, about what they could use n8n for, with the readily available nodes, the training that's already out there, and the support from the community. It's one of the best technologies I've had the opportunity to be involved with in at least the past couple of years. So yeah, big fan. Any questions? While we see whether any questions came in, I'm going to switch over; we've got a slide for that on our side. But Ibrahim, thank you so much for this talk. For me personally, it's always exciting to see n8n save people time, because that's the most precious thing we all have, and it's particularly rewarding to see it helping people who are doing good, because it helps them scale that impact. I can only imagine how draining it would be to spend eight minutes per ticket moving data over, even though you're definitely going to do it, since these are donations from people who want to help out. Now that time can be spent on far more useful things that humans are particularly good at. Let's see if any questions came in; I'm going to quickly refresh here. One question I do have from my side, Ibrahim: when you were picking an automation tool, I assume there were other products on the market, plus Python scripts and whatnot. What made you choose n8n as your solution, as this kind of automation brain for your organization? So, we came from a position, like I alluded
to earlier on, of automating or integrating in any way necessary, any way that was possible. We work with Salesforce, that's our CRM solution, so we could have gone the MuleSoft route or looked at something like Zapier, but the fact that n8n was source available and had that community behind it meant I could just fire it up via Docker in a virtual machine to put that proof of concept together and query such different data sources. It was very easy and very swift to onboard, but also to showcase to my team, the IT team and the Salesforce team, the people who would be able to pick up the challenges and address them. It was a great success, and once we got to the stage where we knew this works, we had no other questions; we didn't want to explore MuleSoft or Zapier, and we've been with n8n since. Fantastic, thank you for sharing; I'm sure everyone at n8n particularly appreciates hearing how much you appreciate our product. For the next presentation (and again, if you have any questions for Ibrahim, please do post them in the chat and we'll collect them for the end), it's my pleasure to introduce our next

### Session 2: Automating the Data Quality Assurance Process for UN OCHA's Humanitarian Data Exchange (HDX) [31:35]

guests, this time from the UN; the talk is on automating the data quality assurance process for UN OCHA's Humanitarian Data Exchange. The United Nations Office for the Coordination of Humanitarian Affairs supports humanitarian organizations in responding effectively to the needs of people caught in crises, in understanding and analyzing those needs, and in mobilizing international assistance. They provide tools and services to help humanitarian organizations ensure that no one affected by a crisis is left behind. Today they're going to show how n8n helps them with quality assurance of data collected in the field. I think it's a particularly interesting use case and I'm really excited for this one. I'm just going to do a quick check: Alex, are you ready to present? Yes, I hope you can hear me well, and I'll just try to share my screen if that's okay. Yes, fantastic. So, Alex Gartner is a software engineer for the Humanitarian Data Exchange; he's part of the United Nations Office for the Coordination of Humanitarian Affairs, and he's based in Bucharest, Romania. He's going to take it away and show us how they've been using n8n to help with some of the problems they've been having. Alex, take it away. Thank you so much for the nice introduction, Max, and for having us here today. As was already mentioned, we're going to discuss how we used n8n in OCHA's Humanitarian Data Exchange platform and how we used it to automate quality assurance. Just to quickly go through the introduction: I'm Alex, a software engineer, and my colleague Yumi, who was here in the chat earlier (maybe you've seen her), unfortunately couldn't stay, so I'm going to do the entire presentation today. A couple of words about OCHA's Centre for Humanitarian Data: the centre focuses on increasing the use and impact of data in the humanitarian sector. There are a couple of work streams that the centre is focused on,
like data science, data responsibility, and data literacy, but today I'm going to focus on the data services work stream, and more specifically on the Humanitarian Data Exchange, or HDX for short, which is an open platform. The idea is for the humanitarian community to be able to share data and to access data in crises, and to easily find, use, and analyze the data that is distributed via HDX. Note that HDX is not the source of the data; we are a data hub, so HDX sits between data contributors and data consumers. It's based on CKAN, which is an open-source, general data management system for sharing data, and at the end of the slide you can see the URL to HDX if anybody is interested in taking a look. Just to give a bit of context, these are the types of data that are shared on HDX. First, data about the context of a crisis; this can be information like the population of a country, or its health and education facilities. Then, data about the people who are affected; this can be information on refugees or internally displaced persons. The third type is data about the humanitarian response; this is data like "3W", which is what we call "who is doing what, where": which organizations are doing what, in which sectors, and in which locations. Another example is financial tracking data. To give an example of what a dataset looks like on HDX, here we have Nepal Health Facilities. The first thing to say is that on HDX, data is published in units called datasets, so Nepal Health Facilities is what we call a dataset, and a dataset has resources. A resource can be geospatial data, as in this case, or tabular data, which will appear in the next example. The idea is for this data to be as machine readable as possible, so that users are able to use it in data
analysis, whether in a script, Excel, Power BI, or whatever else they're going to use. Besides the resources, a dataset has a lot of other fields, like title, description and so on, and all these fields, along with the resources, are the things being checked in the QA process that I'm going to talk about in the next slides: each of these fields needs to be checked to see that it actually matches the rest of the dataset. This is just another example which, in the interest of time, I will skip over; this is internally displaced persons, but here you can see that the resources are spreadsheets, and this is another example, for 3W data. OK, so now to the actual problem that we're trying to solve, which is quality assurance. What are the main objectives? First and foremost, to prevent sensitive data from appearing and being shared on the platform, i.e. to avoid personally identifying information like names, emails, phone numbers and so on being shared. Secondly, as I mentioned, for the metadata to be accurate, so that title and description actually match what's in the resources. Then, to make sure of things like no broken links; here maybe I should say that resources are of two types in HDX: they can be uploaded directly to HDX, in which case broken links are not really an issue, but they can also be external, just pointing to external systems, and then it makes sense to check for problem links. As I quickly mentioned before, we're interested in specific formats that are easy to process. And lastly, the whole idea is to decrease the time spent in this QA process: we have a team of QA officers that deals with this, and we try to make the process as efficient as possible. OK, now to the solution. We actually started with a custom-made solution, but it wasn't easy enough to use, so we wanted to have something more robust. What was decided was that we're going to use Jira for this, so every data or
metadata change will be transformed into Jira tasks or subtasks. How this happens is that every change is turned into events that go onto something like an event bus, and these events then need to be processed and transformed into tickets. For this we needed a solution that can talk to Redis, because that's where the events actually live; that can talk to Google Sheets, because we have some fairly complex spreadsheets with the availability of the QA officers, their time zones, their vacation times and so on; and to Slack, so that we can get feedback on how the updates are going; and also, again, to HDX, to get more data for the tickets. I'll just quickly cover this: this is just a Jira board of how it looks now. And this is maybe a bit more interesting: this is a Jira task example. It corresponds one-to-one to a dataset that was modified or created on HDX, and each of the subtasks that you can see is a field from the dataset that has been modified. Each of these fields is a subtask that a QA officer can then look at and say whether it is OK or not. And this is how a subtask looks, just very quickly: it contains the initial value and then the modified value, as you can maybe see here. If more changes come in before this task is completed, they will appear as comments, for example. So why did we decide to choose n8n for this, and why did we realize that n8n is a good fit? First of all, it's a no-code/low-code solution, and that would allow people in the team who are maybe less technical to have an input on this. Also, there are other teams in the Centre that create pipelines, and we wanted to provide an example for them, so that maybe they will use n8n in their own projects. The visual representation of the logic was actually very helpful in many ways, especially in presenting to other people and explaining what's going on,
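The dataset-change-to-ticket mapping just described, where later edits to the same field supersede earlier ones and each surviving change becomes a subtask under one parent task per dataset, might look roughly like this. This is a minimal sketch: the event shape, field names, and function names are invented for illustration and are not OCHA's actual schema or workflow code.

```python
# Hypothetical sketch of the "change events -> Jira ticket" step.
def coalesce_events(events):
    """Keep only the latest change per field (earlier edits are superseded)."""
    latest = {}
    for ev in sorted(events, key=lambda e: e["time"]):
        latest[ev["field"]] = ev  # a later event overwrites an earlier one
    return list(latest.values())

def build_parent_task(dataset_id, events):
    """One parent task per dataset, one subtask per surviving field change."""
    return {
        "dataset": dataset_id,
        "subtasks": [
            {"field": ev["field"], "old": ev["old"], "new": ev["new"]}
            for ev in coalesce_events(events)
        ],
    }
```

With this shape, two successive edits to a title collapse into a single subtask carrying the final value, which mirrors the dropping of superseded changes the speaker describes.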
and we had very good feedback from both technical and non-technical people on how easily they understood what's happening. The visual execution results are of course super helpful, when you see the data coming in and going out and so on, and, as I think I've already mentioned, the easy integration with other services was really nice to have. And now, to get a bit more into the weeds, this is how our main workflow looks. It looks a bit complex, but you don't need to squint to understand what's happening; I'll zoom in on the important parts, don't worry. I'll just say that it all starts via a webhook, and I will explain a bit later how the webhook is called. Just to say two words about the beginning: this column deals with merging the events together. Many events come in here for a single dataset, and some of these changes may no longer be needed because they were superseded by other changes. For example, if a description was changed multiple times in a short period of time, we can just drop the earlier changes, and in cases where lists were modified a couple of times, we are only interested in the final result; this is where all that happens. Then the next column is where we actually prepare the content for Jira: all the rich-text formatting and lists, and populating the fields. The rest deals with actually creating the tickets, and as I said, I'm just going to zoom in on a couple of parts. This is the Jira creation part, where we create the parent task, the ticket corresponding to a dataset. First we get the schedule of the QA officers, i.e. who is available to deal with this ticket as fast as possible; then we keep a bit of state in Redis so that we don't assign to the same officer all the time; and then we create the tickets corresponding to the dataset. There's also a Wait node, because we have to be careful about not hitting Jira too often,
because there's a rate limit. This part deals with actually creating the subtasks. The lower part here deals with either creating a new subtask or, if there is already a subtask for that specific dataset and field that is not yet resolved, just adding comments to the existing subtask. Here we also check whether the parent ticket needs to be updated; for example, if the title of the dataset has changed, we need to change the parent ticket as well. And the top part here is also in the interest of making life easier for the QA officers: if the dataset was deleted, it no longer needs to be checked, so we just mark the ticket as failed. OK, and now there are a couple of special considerations that are more specific to our use case, and I thought it might be interesting to bring them up. We are surely not n8n experts, so the solutions we came up with might not be the best ones, and I'm really curious about feedback on this. For the main workflow that I already showed, there's a constraint that there shouldn't be several executions running in parallel, because that could create several Jira tickets at the same time for the same dataset, for example, which is not what we want. On the other hand, we have three production n8n instances, so they might actually run in parallel, and we came up with a locking mechanism based on Redis so that this doesn't happen. The second consideration is that we don't have direct access to production; this is just one of the rules of the organization, nobody has direct access, so we can't actually access the UI. What we found useful is that we have a Jenkins job that exports the execution data, and then we have all the outputs of each node in a JSON, which is actually very helpful. And third, about testing: we went with having some mock nodes that simulate the communication with external services, like Google Sheets or Jira, and we have a Python
script that replaces the real nodes with the mocked nodes. Then we run the execution via the command line in GitHub Actions, look at the output, and check that it's what we expect it to be, for example that the batch operations run the number of times we think they should. This is the start workflow, the workflow that basically triggers the main workflow, and here is the lock that I was talking about. The first thing it does is acquire a lock in Redis; then it does a couple of other things, like grouping the events by dataset, so that when it hands the data to the main workflow, they are all events that pertain to a specific dataset, and it also makes sure the events are sorted by event time. At the end, of course, it releases the lock, and you can see that it also notifies on Slack if it didn't manage to acquire it. This is our error workflow; in the interest of time I'm going to skip over it, there's nothing that interesting here, just that if an error happens while the lock was held, it releases the lock. And this is just a screenshot of the mocked nodes, where, as I said, we mocked the nodes that communicate with Jira or Google Sheets. So all in all, n8n brought a lot of advantages for our specific use case, which I already went over. What would be nice to have, to help us, would be a view-only role for n8n, just because we're not allowed to make changes directly on production; with a view-only role we would actually get access to the UI, which we don't have today. And second, because we have so much JavaScript code in the workflow, some additional features related to the Code node, especially features for reusing JavaScript code, like reusing functions, would have been helpful; maybe this exists and we just didn't realize. And yeah, conclusions: basically it was a success, it really helped our QA team to have this new process,
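The Redis-based lock mentioned above, one execution at a time across three instances, is commonly built from an atomic SET-if-not-exists with a TTL plus an owner-checked release. Here is a minimal sketch of that pattern; it uses an in-memory stand-in for the two Redis operations so it is self-contained, and the key name, TTL, and function names are assumptions for illustration, not OCHA's actual implementation.

```python
import time
import uuid

class FakeRedis:
    """In-memory stand-in for the two Redis operations the lock needs:
    SET key value NX PX ttl, and an owner-checked delete for release."""
    def __init__(self):
        self._store = {}  # key -> (owner token, expiry as unix time)

    def set_nx_px(self, key, token, ttl_ms):
        entry = self._store.get(key)
        if entry is not None and entry[1] > time.time():
            return False  # lock held by another owner and not yet expired
        self._store[key] = (token, time.time() + ttl_ms / 1000.0)
        return True

    def release(self, key, token):
        entry = self._store.get(key)
        if entry is not None and entry[0] == token:  # only the owner releases
            del self._store[key]
            return True
        return False

def run_exclusively(redis, job, lock_key="qa:main-workflow:lock"):
    """Run `job` only if this instance wins the lock; otherwise skip
    (the workflow described in the talk notifies Slack in that case)."""
    token = str(uuid.uuid4())  # unique owner token per execution
    if not redis.set_nx_px(lock_key, token, ttl_ms=60_000):
        return "skipped"
    try:
        return job()
    finally:
        redis.release(lock_key, token)  # mirrors the error-workflow cleanup
```

The TTL guards against a crashed instance holding the lock forever, and the owner token prevents one instance from releasing a lock that a later execution has since acquired.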
everybody understood how it's working because it was easy to explain, there's a clear decrease in the time needed to finish tasks, it's easier to assign tasks to people than before, management is easier, and, also very important, there's a history of how each task was solved, which was very helpful for the team: if they're not sure about a specific case, they can look back in the history. And what everybody involved in this process was saying is that they now have a better overview of the data changes that happen on HDX. And yeah, just to show: the median time before was around 8 hours, and now the median time is around 4 hours 49 minutes; if we look just at weekdays that decreases further, and if we look at weekdays inside office hours it's even better. Thank you so much, and sorry it took a bit longer to go through the slides, but if there are any questions... No apologies needed, thank you so much, Alex. I think it's really interesting to see how larger organizations that have constraints and compliance requirements are applying n8n in a world where there are fixed rules and some things to deal with. Very interesting, so thank you very much for that. We've got a few different questions

### Questions & Answers [53:10]

accrued from the different chats. In the interest of time I'll just start with the first question, to get through all of them, so we won't do it by chunks; if I could ask all the speakers to please stay until the end, it's about seven more minutes. But thank you again, Alex, and thank you again also, Ibrahim, for both of your talks. Product updates are obviously interesting, to understand what's happening at team n8n, but I think we're always so excited to understand what you are doing with the tool that we're building. The first question we have is for Giulio and comes from Felix: "I prefer using data only from the last node, because using data from earlier nodes gets messy quickly. The main issue is that it's unclear which expressions need updating in which nodes when making changes to nodes. Is there something planned to make it clear, when nodes get edited, that there are other nodes directly affected?" Giulio? Felix, do you want to provide more context for this? No? OK, I can answer anyway. At the moment we don't have anything planned about that, but of course we always listen to the community, so if this is something that is coming from the community, we can probably take care of it. One thing I'd like to mention is that if the modification to a previous node is just changing the name of the node, this doesn't affect the functionality of the workflow: if you rename one of the previous nodes and you're using data from that node in an expression down the chain, this is not going to break anything. Of course, you can break something if you really change the logic of the workflow, and maybe a previous node then handles different data; in that case something could break. We don't have anything planned to deal with that at the moment, but thanks for bringing it up, maybe we can consider it for future developments, and feel free to get in touch
with me, Felix, via email or on the community forum, you'll find me there, if you want to add more details to that. Thanks a lot, Giulio. We'll move on to our next question; this comes from Alexander, and I believe this is still for you, Giulio: "I recently learned how to automatically install npm packages in a Docker environment without a custom image, using n8n and the n8n init node. Is there now a feature to easily install npm packages? I have found at least one table, installed_packages, that seems to indicate this." The quick answer is no, at the moment there is not. This is actually something we've been discussing internally, and thanks for sharing the workflow; I downloaded it and had a look, and that's a very interesting approach, I didn't even know it was possible to load npm packages that way. We've been discussing this especially because I raised the question on the community asking what the most needed features for the Code node are, and the number one request was to be able to use other packages. But there are some security concerns, so it's very difficult to allow loading arbitrary npm packages there; we have to be stricter for security reasons. Since we are now working on an overhaul of the Code node, we are still open to and evaluating some opportunities to make it easier to load npm packages. The table that you're referring to is probably the list of packages that are already used by the application, and we are also considering whether to allow using those packages, but we are not sure about that, because in any case it's a limited set of packages, and what we hear from the community is that they would like to use any package they might need. So, TL;DR: stay tuned, check the new things coming for the Code node, check the conversations on the community, and let's see if we
can come up with an easier solution for that. Thanks a lot, Giulio, for providing a little context there, appreciate that. Moving on to our next question: we've got a question from Jim for Ibrahim: "I've seen a lot of charities picking up AI for various things you mentioned. Are there any plans to extend your current n8n workflows to use more AI in the future?" Great question, thank you, and I tried to answer it in the chat, as I don't know if I'll be able to stick around much longer. To put it briefly: we're a charity, so we hold quite a lot of sensitive data about our community; we have data around people who are impacted by brain tumours, and their health-related data, so we need to ensure that we use solutions that go through a very diligent process, or that are air-gapped as much as possible. Data protection is key in all of that, and the same applies to AI. We would love to have AI tools and the technology and skill set within the charity to build a business case to make use of them, to extend tools such as n8n, but we're very accountable for our expenditure. We do have wonderful people who want to volunteer their time and their skill set to the charity, so going forward, if we were to have the opportunity to discuss such solutions, absolutely, we can definitely take that into consideration. People may not like this answer, but AI has also been a bit of a buzzword that's been going around quite heavily; working in an IT role, we get to collaborate quite heavily with loads of other people within the charity, external to the charity, and within the community, and I need a business case for the usage of AI and its viability. So if that were potentially there, and if we were able to have a solution that ties together the data protection, the cost, the usage and the simplicity with a business case, absolutely. Hopefully that answers it. Fantastic, thank you
for that answer, and Ibrahim, we've got some very exciting things happening at n8n with basically self-hosting and air-gapped AI and that sort of thing. I think it's too early to share right now, but I'd be happy to pick up that conversation with you; I think there's some interesting potential there. At the moment we're officially at time, so let me just check if we have any other questions to answer quickly... nope, we don't. So firstly, thank you everyone for your time today, I know it's Friday after hours; I hope you enjoyed the sessions as much as I did. Our next hangout is probably not on the 18th of June, I would imagine, so we will post a correction to that; Lis, if you happen to know already, please let me know. Otherwise we'll be posting all of the correct information on the n8n community events page, so go to that page, bookmark it, and check it, and that's how you'll know when our events are happening in the future. I know we're going to be hosting one soon enough, and we'll have some more exciting speakers for you and some more updates from the team on what we're working on. Again, if you have any follow-ups on product ideas or anything you've been discussing with Giulio, the community forum is the best place for those conversations: that's community.n8n.io. My name is Max, and thank you so much for joining. We're going to stop recording, this is officially over, but we're going to hang out for a few minutes, so feel free to have a bit of a chat with us informally if you'd like; you can unmute your microphone and have more of a free conversation as we formally wrap up the community hangout itself. Thank you everyone, until next time.
