# React to real-time market data with n8n and QuestDB 📈

## Metadata

- **Channel:** n8n
- **YouTube:** https://www.youtube.com/watch?v=gYc9-NRz75E
- **Date:** 29.03.2021
- **Duration:** 54:49
- **Views:** 2,497
- **Source:** https://ekstraktznaniy.ru/video/15844

## Description

Over the last few months, there have been many ups and downs in the financial markets. Have you considered using automation to stay on top of your game?

With n8n, you can easily connect different services to make faster and better decisions and take action on events you care about. Integration with QuestDB means you can easily add an open-source time-series database built for high performance directly in n8n workflows.

In this webinar, you will learn how to build a workflow in n8n which monitors cryptocurrency prices, sends this data to QuestDB for time series analysis, and triggers Slack alerts in real-time.

## Transcript

### Introduction

Hey, good evening everyone, and welcome to the webinar React to real-time market data with n8n and QuestDB. My name is Tanay Pant and I'll be the moderator for the webinar today. I have with me Harshil and Brian, who will be the presenters today. Harshil is a junior developer advocate at n8n; he's an author, ambassador, and a Mozilla representative. Harshil enjoys experimenting with new tools and technologies, building cool stuff, and sharing his knowledge with the community. Brian is a technical writer at QuestDB who loves creating developer documentation and hacks on open source software: Python, Node.js, Ruby, Arduino, Swagger docs, and cellular IoT. Before I pass over to Harshil and Brian, a few housekeeping things: feel free to send any questions you might have throughout the duration of this webinar via the chat panel on the right, and we'll answer questions at the end of the webinar. We also have a couple of polls that we'll be asking throughout the presentation, so watch out for them. Also, if you have any problems with sound, or if you can't see the presentation, or something like that, please let me know through the chat. And with that, I would like to hand over to our presenters, Harshil and Brian. Thanks. Oh, Harshil, I think you're muted.

Hello everyone, okay, so welcome to the webinar React to real-time market data with n8n and QuestDB. There have been a lot of ups and downs in the financial markets over the last few months, so in this webinar we are going to build workflows that will automate the process of monitoring Bitcoin's value for real-time insights. You can modify this workflow to monitor the stock market as well. Talking about monitoring, Brian, how do people generally keep themselves updated with the ever-changing price of Bitcoin?

Yeah, that's a good question. Usually, for people who are checking Bitcoin prices, maybe you're doing this on the side, or it's an interest for you in terms of personal finances, it's something that probably involves a lot of manual effort, especially if you're going to try and get a little bit serious about it and build some automations. If you want to code something, obviously this takes quite a bit of time, and you have to invest in figuring out how to interact with APIs and grab the information that you're interested in, and then of course react on that.

Yeah, makes sense. So it's more manual work if you don't want to code your own automation for that, right?

That's right, yeah. There are quite a few resources out there for building various different tools; in the Python ecosystem there's quite a lot of stuff for people building automated tools, but it takes a while to figure out what the platforms are giving you back, or exactly what kind of system you want to build. So this is pretty time intensive.

Gotcha, and that's where n8n comes into place. So today we will build an automation workflow which will ingest Bitcoin's value into a database, and we'll build another workflow that will query the data and then send us this information on Slack. Now, talking about n8n: n8n is a fair-code

### About n8n [4:00]

licensed tool that helps you automate tasks, sync data between various sources, and react to events, all via a visual workflow editor. To use n8n, you can either host it on your own servers or sign up for n8n.cloud. When you access your n8n instance, you are greeted by an editor UI where you build your workflows. As you see in the image over here, this is the workflow that we will build today. A workflow is a key component of n8n; a workflow can be started manually, or it can be triggered by an external event. The workflow in this image gets triggered whenever we issue a slash command. Now, if you see over here, all these integrations are the nodes in n8n, which are another key component. These nodes are the building blocks of the workflow, and currently there are 270-plus nodes in n8n which you can use to build your own automation. Coming back to this image, you see that we have different nodes over here: we have a node for QuestDB which can perform certain operations, and we have another node for Slack which can post a message to Slack or get some information from Slack as well. So there are a lot of options over here. Brian, now that we are introduced to n8n, what would be the next steps involved in our workflows?

Okay, we have to first get the data that we're interested in, we have to store it somewhere, and then we have to ask questions about what that data looks like. So the perfect tool for this job is basically a time-series database.

Awesome, okay. So let's first take a look at the editor UI.

### Nodes [6:00]

Now, when you open the editor UI, you will be greeted by a couple of options over here, and they give you a lot of flexibility: they give you the option to create a new workflow, open an existing workflow, even download a workflow, and there are a lot more options, and I would encourage you all to go ahead and check them out. By default there is a Start node; the workflow starts from the Start node, and the Start node cannot be deleted. Talking about nodes, there are two different types of nodes in n8n. The first is the trigger node: these nodes start the automation and supply the required information to our workflow. For example, in one of our workflows we will use an Interval node that will trigger a workflow to run every second. The next are the regular nodes, which basically do the actual work: they can add, remove, and edit the data in the flow, as well as request and send data to external APIs. An example over here would be the QuestDB node; another example is the CoinGecko node. As mentioned, we will use the Interval trigger node to trigger a workflow to run every second. I will add the Interval node now; by default it is configured to run every second, but let's just say you want to run it every five minutes: you can just set the interval time in the Interval field and select the unit to minutes. But again, we want to run it every second, so I'm going to change it back to the default values. Now we have a trigger node; the next thing we want to do is get some values from the CoinGecko node, so I'm going to select it. One of the things I like about the CoinGecko node is that you don't have to go ahead and sign up on CoinGecko to create your own account and get API keys, so it makes it really easy to quickly get started. If you see over here, we also have a lot of operations in the CoinGecko node, and we are interested in the Price operation, and we want to search the coin by ID. There are options for base currency, so this would be Bitcoin in our use case, so I would do BTC. You can add multiple base currencies: let's just say you are building this workflow to get information for Ethereum as well, then you can add Ethereum too, but I am just interested in Bitcoin at the moment, and the quote currencies that I want to use are USD as well as EUR. Now if I execute this node, it will get the value from CoinGecko, so we see now the value of Bitcoin in USD as well as EUR, and it is a bit expensive. Okay, so the next thing we want to do is use a Set node that will set the information that we want to ingest into QuestDB. Now, talking about this, Brian: we have this data that would be coming every second, right? So for that kind of ingestion, what kind of database should we use?

Yeah, so we should

### Timeseries data [9:35]

use a time-series database. So these slides here are about what time-series data is. Time-series data, we're all familiar with it: this graph here on the left is Apple's share price over time. That's all time-series data is, basically; we're just looking at the changes of some value, of some metric, over time. The difference from traditional databases is that there you would just store and retrieve the latest known value for something. Let's say you had a customer and you changed the customer's address, and then you want to fetch that information: you just have the latest known state. But a time-series database is perfect when you want to have the history of the change of this metric over time. This one quote here at the bottom of the slides, I think it was fairly astonishing, right, that in 2017 this IBM study stated that 90% of the data in the world had been created in 2015 and 2016. That's pretty surprising when you think about it, but it indicates that we have this explosion of information, and we're interested in collecting more of the information that we're generating and trying to make sense of it. And the more information that we have, and the more that we're trying to make sense of it, that breeds the necessity for new tools that are specialized in tracking how these metrics change over time. And then this graph here on the right is basically showing the rise in popularity of time-series databases: you can see it's the fastest-growing category of database at the moment, which is, I'd say, a side effect of the prevalence of time-series data. It's pretty much everywhere, and people are realizing the value of tapping into this, right?

Awesome, thanks for explaining time-series data. So now I'm just a bit curious what kind of database I should use. I know I have to use a time-series one, but what should be the service that I should use?

Well, of course I'd recommend QuestDB. So typically when you ask

### QuestDB [12:10]

an engineer or some developer what the slowest part of their stack is, it's typically the database; this is really commonly a bottleneck in most services. QuestDB has been under development since 2013; it's undergone several major revisions, and right now it's open source, and version 5.0.6 is the latest. It uses SQL as its query language, which is a big advantage over a lot of other systems where you have to learn a specific DSL, some kind of esoteric query language. SQL has been around for about 50 years now, so it's battle-tested; it's pretty much a favorite in a lot of the analytics world because it answers questions about data very intuitively, right? You have a lot of common English keywords that make sense when you're reading it, and it's just the right language for asking questions about what data looks like.

Ah, that's good,

### Creating a table [13:30]

so I don't have to learn another language; I can reuse my knowledge of SQL.

Exactly, yeah.

That's awesome. Okay, so let's go ahead and create a table in QuestDB. Brian, would you help me out? My knowledge of QuestDB SQL is a bit rudimentary at the moment.

Yeah, so what we need to do is use the CREATE TABLE keywords.

Okay, so it's CREATE TABLE and then the table name?

Yeah, that's it.

Awesome, and over here I specify the fields?

Yeah, so we're interested in two metrics, right? So we need one for usd, and we want to give it a type, so we can say float in this case.

Okay, and then eur will again be float?

Exactly, and then there's a third metric that of course we're interested in, which is the time, right?

So what type would this be?

This one is a timestamp.

Okay.

Right, so at this point this query will run perfectly fine, but QuestDB has a feature that allows us to index this timestamp column. What we can do at this point is use a timestamp function: it's just after your closing bracket there, yeah, timestamp, then open bracket, yep, and then the timestamp column.

Interesting, so by designating one column as our timestamp, we get extremely fast performance on reads, on queries basically. Okay, so I'm just going to hit Run, and I see I got a message that the table was created. Now if I refresh this, I see my table. That's good. So now we have a table in
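Pieced together from the exchange above, the statement typed into the QuestDB web console would look roughly like this (table and column names as spoken in the webinar):

```sql
CREATE TABLE btc (
    usd float,            -- Bitcoin price in US dollars
    eur float,            -- Bitcoin price in euros
    timestamp timestamp   -- ingestion time
) timestamp(timestamp);   -- designate the timestamp column for fast time-based queries
```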

### Creating a timestamp [15:30]

QuestDB which has the fields usd, eur, and timestamp. So now I know what fields I want; I'm going to use the Set node to set these values. In the Set node you can set either a boolean, that's true or false, a number, or a string. Since all the values that we get are numbers, I'm going to select Number, and the name I am going to enter is usd. The value in the Name field should be exactly the same as what you have in your table. Now we want to get this value from the previous node, so I'm going to click on Add Expression, which opens the expression editor for us, and then reference this value from the previous node, usd. I'm going to do the same for eur as well: Number, eur, and then reference the value from the previous node. Now, here's a quick poll for you all: since I am not getting the value for timestamp, I'm just curious whether we can use some kind of code snippet in the expression editor to create this timestamp for us. Please let me know what you all think. Okay, so moving forward, I see a hundred percent of the audience said yes, so that's good. Yes, we can run code snippets in the expression editor, so over here we are going to create a timestamp. Again, I'm going to open the expression editor, and in it I am going to run a JavaScript code snippet. To run a code snippet you have to wrap it in curly braces. I'm using the JavaScript Date object, and since we have to pass on the timestamp in microseconds, I am multiplying it by 1,000.

So we use the Date object and the .now() method, which gives me the current timestamp in milliseconds, and then I multiply by a thousand to convert it into microseconds. Now if I execute this node, it will give me the information that I've created in this particular node, but it also gives me the information from the previous node, which I am not interested in, so I am going to toggle Keep Only Set to true. Now if I execute, it will only show the information that we have set in this particular node: you see now we just got usd, eur, and timestamp.
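The expression described above can be sketched as plain JavaScript (inside n8n's expression editor it would be wrapped in curly braces). Note that `Date.now()` already returns milliseconds; multiplying by 1,000 yields microseconds, which is the resolution QuestDB uses for its designated timestamp type:

```javascript
// Date.now() returns milliseconds since the Unix epoch
const nowMs = Date.now();

// QuestDB designated timestamps use microsecond resolution,
// so scale milliseconds up by a factor of 1,000
const nowMicros = nowMs * 1000;

console.log(nowMicros);
```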

### Creating QuestDB credentials [18:50]

So we now have the data that we want to ingest into QuestDB, so let's get the QuestDB node. For QuestDB you have to create your own credentials: you can click on Create New, enter a name for your credentials, and select the host. I am currently running it locally on my machine, so it's going to be localhost, and I am just using the default values, so I'm not going to change any of this. Now, I have already set these credentials for myself, so I'm not going to go ahead and create them, but if you want to know more about how to create your own credentials, you can click on the Open Credential Docs link over here. This will take you to our documentation, where you can learn how to create these credentials. So I'm going to select the credentials that I already created.

I think we may have lost Harshil for the moment.

No worries. While he comes back, tell me a bit about QuestDB, because I see in the chat we had a question around how QuestDB compares to Postgres and InfluxDB.

Yeah, so with Postgres: QuestDB has a Postgres connector, or at least we allow people to read and write data over the Postgres wire protocol. Any programming language that you're using has packages that allow you to read and write to databases using these credentials, and that opens up quite a lot of compatibility. It also allows people to easily integrate with their own existing stack, so it's fairly easy to get up and running with the Postgres credentials. Then with Influx: InfluxDB is also a time-series database, and we also support writing data through the InfluxDB line protocol. This is typically for higher ingestion rates; we can write quite a lot faster using the InfluxDB line protocol. That would be, let's say, in an industrial setting, or if you're writing maybe a million records per second; that's the kind of use case where you would use the InfluxDB line protocol. But otherwise, for compatibility or for reading, you would use Postgres.

That's cool, so you can basically use the InfluxDB line protocol in addition to standard SQL as well?

That's right, yeah. We also have a REST API, which is useful for bulk exports or bulk importing, but typically for high ingestion rates people are using the InfluxDB line protocol, and then for querying the data, the Postgres connector.

Cool, perfect. All right, it seems Harshil is back, and while Harshil shares his screen, I had another question for you, Brian, based on what you said about the PostgreSQL protocol. That's really cool, because the nodes in n8n are actually written in TypeScript, so we also use the PostgreSQL protocol, and that's how we are communicating with QuestDB as well.

Cool, all right, then please take it ahead, Brian and Harshil.

Thank you. Awesome. So, Brian, as I was saying, we now have this workflow running, right, and it is executing every second and ingesting data into QuestDB. So we will be having a lot of data in QuestDB; how can we make sense of all this data?

Yeah, so now what we want to do is query it, and ask what the trends in this data are over time. We need to be able to ask that in a convenient way from QuestDB.

Awesome, so we would be
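For reference, a single InfluxDB line protocol message for the table used in this webinar might look like the line below (the prices are made up for illustration; the trailing number is a nanosecond Unix timestamp, per the line protocol's convention):

```
btc usd=57000.5,eur=48400.2 1617029400000000000
```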

### Running SQL queries [23:20]

running some kind of queries then?

Yeah, so we can run SQL queries.

So can you take us through a few queries that we can run?

Of course, yeah. So here we're on the web console, where we can write some queries directly. If we want to see what's in the btc table, we can do it the long way, which is SELECT * FROM btc, and this should give us everything that's in the table so far; right, so we have 56 rows. The shorthand is that you can get rid of the SELECT * FROM, so we can just write btc, and that'll give us everything; that's the same result.

That's interesting, okay.

So one of the main benefits of QuestDB is that we have a few SQL extensions for time-series data. Let's first have a look at the average, so let's do SELECT avg...

This is a function, right?

Yeah, that's a function, so we need open-close brackets: avg(usd) FROM btc. This will give us one row, and that's the average across the entire table, so that's all 56 rows, or however many we have. Now let's say we want to create one-minute or five-minute aggregates and see what the average during those buckets is. What we can do here is add a SAMPLE BY, and then we specify the time.

That's it?

So we do the number followed by the unit, so we can write 1m for one minute, and having the timestamp would be good as well, so we can do SELECT avg(usd) and the timestamp. That's it, right? So we have three rows; our workflow has been going for three minutes, and for each of these one-minute buckets we have an average of what the usd value is.

Interesting. So if you had a table with maybe a month's worth of data, you could sample by day or hour, of course, and split it into aggregates however fine-grained you want the averages to be.

Awesome, that's really interesting. So for the units, for hours and days it would be h and d respectively?

So for days, instead of 1m you would use 1 followed by a lowercase d.

Okay, and for months it's a capital M?

Exactly.

Okay, awesome. So now this makes more sense to me, and being an automation enthusiast, I wouldn't like to come to the console every time I want to run a query. So what I'm going to do is build a new workflow which will connect my n8n workflow with my Slack workspace, right? This would also help me get the output on my mobile phone: let's just say I am using Slack on my phone and I just run this slash command, and it will give me the output on my phone as well. Also, I think this would help me make better decisions whenever someone tweets about Bitcoin, or whenever Elon Musk tweets about cryptocurrency.

Of course, yeah.

Awesome. So, talking about Slack, we
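Pieced together, the queries walked through in this section would read as follows (table name and behavior as described above):

```sql
-- Everything in the table (shorthand: just "btc;")
SELECT * FROM btc;

-- Average USD price across the whole table
SELECT avg(usd) FROM btc;

-- One-minute averages using QuestDB's SAMPLE BY extension
-- (units: s, m, h, d, and a capital M for month)
SELECT avg(usd), timestamp FROM btc SAMPLE BY 1m;
```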

### Slack [27:40]

first need to create a slash command in Slack. You have to first log in to your Slack account, then head over to api.slack.com and click on Create a Custom App. Now we have to enter an app name, so I'm going to call this Bitcoin Picker, and select the workspace; I'm going to use a dummy workspace at the moment. Now our application is created; the next thing is to create a slash command. I'm going to click on Slash Commands, click on Create New Command, and I can define the command that I want, so I'm going to call this avg, short for average. Over here it is asking for a request URL; this is the URL that I will get from the Webhook node, so let's go back to n8n. Since I want to create a new workflow, I'm going to click on New, and now add the Webhook trigger node. There are two different URLs: a production URL and a test URL. We use the production URL when we have completed building a workflow and our workflow is in the active state, that is, we are not executing the workflow manually. Since we are still building this workflow, I'm going to use the test URL; I'm going to click on this to copy the URL. Another important thing to note over here is that Slack makes a POST request, so instead of GET, I will change the HTTP method to POST. Now I will add the URL to my command over here. I also have to give a short description, so I'm just going to write "help me make decision" and save this. So now my Slack slash command is also created; the next step is to install it on your Slack workspace: Install to Workspace, and I have to allow it, so I'm just allowing it to get added. Awesome, so my application is now added to my Slack workspace. Now let's test it out. The first thing is that you have to save your workflow whenever you are using a trigger node; this registers the webhook ID. So I'm going to save this first; I'm also going to call this workflow "help me make decision", and I'm going to execute the node. I'll come to my Slack workspace, where I've created a channel called #bitcoin, and run the /avg slash command, and I see some output over here. If I go back to n8n, we can see some data that's returned by the slash command: you can see that we get the channel ID, the channel name, the command that was sent, and also the text, if the user passes on some text. So let's try to pass on some information and see what we get: /avg, and I'm just going to pass 1m. Again we got a message over here, and if I go back to n8n, I can see the information and the text as well. Now, we oftentimes might want to query data for a different sample, right? Currently we are sampling for one minute, but let's say I want to sample for five minutes; it wouldn't make sense to open the workflow, change the query, save the workflow, and then run the slash command. So what I'm going to do is use an IF node which will check whether the user has passed a parameter with the slash command or not. If they have passed a parameter, we will create a dynamic query which will query based on that sample and return that value; if the user does not pass a parameter, we will have a default query which gets executed, and we will share that result on Slack. I'm going to add the IF node and add a condition. Over here, since the value that I'm getting is a string, I'm going to select String, and I am going to select the Is Empty operation, because I want to check whether it's empty or not. Again, I'm getting this value from the Webhook node, so I'm going to do Add Expression, Current Node, Input Data, JSON, and inside the body I get the text. Now if I execute this node, you see that we didn't get any output in the true branch, but if I go to the false branch, we get the output there, because the condition over here is false, and that's why the output is in the false branch. So let's go ahead and connect the other nodes to the false branch, so that we have the queries that are being run
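The branching logic described above can be sketched in plain JavaScript (the function name and the default interval are mine, not from the workflow): if the slash command's text is empty, fall back to a default one-minute sample, otherwise interpolate the user's value into the SAMPLE BY query.

```javascript
// Build the QuestDB query the workflow runs, based on the slash command text.
// Empty or missing text falls back to the default one-minute sample.
function buildQuery(text) {
  const sample = text && text.trim() !== "" ? text.trim() : "1m";
  return `SELECT avg(usd), timestamp FROM btc SAMPLE BY ${sample};`;
}

console.log(buildQuery(""));   // default query
console.log(buildQuery("5m")); // user-supplied sample
```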

### Query [33:20]

by QuestDB. So now, since we want to query QuestDB, we'll add the QuestDB node. I'll select the credentials that I've already created, and now we are interested in executing a query, so I'm going to select Execute Query. This query is going to be dynamic, so I'm going to click on the gear icon, Add Expression, and I'm just going to copy this query and paste it over here. Now, the only value that's changing over here is the time period, this 1m, so I'm going to get this from the Webhook node again: Current Node, Input Data, JSON, Body, and then Text. If I execute this node, we'll get the value; so yeah, we're basically running a query and getting the value for it. If you look at the table format, we get the average as well as the timestamp. Now, this timestamp does not look very readable, so I am going to use another core node of n8n, the Date & Time node, to convert it to a custom format. I'm going to get this value from the previous node, so Current Node, Input Data, JSON, and then Timestamp. I am going to give this the property name timestamp, and since I want a custom format, I'm going to select Custom Format and enter a format starting with dd.MM. So dd basically stands for the date, so we'll have today's date, that's 29, then a dot and the month, that's 03, and then the hour, minute, and second. If I execute the node, you see that we got the timestamp in the new format. The next step is to add the Slack node
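A rough JavaScript equivalent of what this formatting step does (the function name and the exact output layout are my assumptions; the workflow uses the Date & Time node's custom format instead) is to divide the microsecond timestamp back down to milliseconds and print date, month, and time:

```javascript
// Convert a QuestDB microsecond timestamp into a "dd.MM HH:mm:ss" string (UTC)
function formatMicros(micros) {
  const d = new Date(micros / 1000); // JS Date takes milliseconds
  const pad = (n) => String(n).padStart(2, "0");
  return `${pad(d.getUTCDate())}.${pad(d.getUTCMonth() + 1)} ` +
         `${pad(d.getUTCHours())}:${pad(d.getUTCMinutes())}:${pad(d.getUTCSeconds())}`;
}

console.log(formatMicros(1617029400000000)); // a timestamp from 29 March 2021
```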

### Slack Node [35:30]

so that we get the message on Slack. I'm going to select the Slack node. For this we have to create a credential, so I'm going to create a new one, since I created a new app. Over here we need the access token, so I'm going to copy it and add it over here for the Bitcoin Picker Slack app. Another important thing to note whenever you are working with Slack is to give your application the right scopes, so I'm going to go ahead and give my application the scopes to send messages to Slack: under Bot Token Scopes, I'm going to give it the chat:write as well as the chat:write.public scope. Now you see that Slack prompted me to reinstall the application since we added new scopes, so I'm just going to allow this application. Awesome, the application is now updated. Since we want to send the message to Slack, we have to specify the channel, so I'm going to get this again from the Webhook node, but this time I am going to use an expression; this expression will basically remain the same throughout all the messages that we send. To make it look a bit pretty, I am going to use fields: I'm going to call this one Time, so this will return the time, and then we'll add another field, which will be the value in USD. Now if I execute this node, it will send the message to Slack, and you see that we got some output over here. Let's take a look at how it looks on Slack: we got the time and the value in USD as well. So we have completed the first part of this workflow; it will output a value based on the input given by the user. Let's test it out with a different value: I'm going to execute the workflow and try it out with 30 seconds. We got the message that the workflow started, and we are getting the values for a 30-second interval.

### Static Query [38:30]

So we now have a dynamic query which runs based on the input given by the user, but what if a user does not give an input? What if they just use the /avg command on its own? We still want to give them some output, so for that we will use a static query. I'm going to duplicate the node, connect it to the true branch, and remove the expression; by default, for now, we want to give out the value sampled by one minute. If I execute this node, we won't see any data again. Why? Because the condition is false; see, we don't have any value in the true branch. So let's just change that: I'm going to execute the workflow and run this command, this time without any parameters. We got the message that the workflow started, that's good, and our QuestDB node also ran the query, so we got the data for a sample of one minute. I am going to duplicate these nodes

### Execute Workflow [39:50]

and connect them to the previous node, so that I don't have to recreate all these nodes again. If I open this node and execute it, you see that we now have this information. I'll also duplicate the Slack node and connect it with the Date & Time node. Now if I execute the workflow and call the /avg slash command, the workflow will get executed, it will run the static query, and then return me the information.

### Renaming nodes [40:40]

Now, at this point there are a few nodes, and we are kind of repeating the nodes, right, and it might not make sense when you come back to this workflow after a couple of days. So let's rename the nodes so that they make more sense: I'll first rename the IF node to If No Parameter, then I'm going to rename this one to Default Query, and this one is Dynamic Query.

### Whats next [41:20]

The idea behind renaming the nodes is to give yourself a better visual cue, so whenever you come back to your workflows and take a look, you know what is happening at each particular node; it makes the workflow more readable. So if there's no parameter you run the default query, and if it's false you run a dynamic query. These are a few things that are really useful when you're building your workflows. Awesome. So Brian, now that we have completed this workflow, what's next? What else can we do with n8n and QuestDB? Well, there are a ton of different things we could do. Part of the benefit of using QuestDB is that we have extremely low latency on reads, and also on writes. If you have a huge dataset and you want to query across years' worth of data, you're still going to get sub-second, or a couple of hundred milliseconds, response times. So we could reuse the same slash command to give us averages across some other dataset that's maybe gigantic. Or we could take inputs from different sources: some of the n8n nodes we have are Shopify and other e-commerce nodes, and Salesforce of course, so we could be ingesting from these nodes and running queries based on metrics we're interested in. The obvious use case would be BI insights: if I want to run the same kind of query at a different date, what does the last week look like in this particular dataset? How do my sales, conversions, visitors, or sign-ups look? You're running the same query but over a different time period, so you could easily rewrite this automation to look at that kind of data.

Awesome. Another idea I have: right now we are just posting some textual information, but maybe we can also create some graphics around that and have a better visual cue for all this data. Maybe we create some graphs, which can be really helpful when we are looking at the data. Yeah, of course. In QuestDB itself we have some basic charting functionality and some visualization tools, but images of data, like that first chart we saw of the Apple stock, communicate the data almost instantly. So when we have the ability to make some nice-looking charts and graphs of the data, it really helps us understand what's going on. Exactly. So let's say you want to create a chart for this particular workflow. Just to give you an idea, what you can do here is connect a Bannerbear node or an APITemplate.io node right before the Slack node, which allows you to manipulate some information on an image, create a dynamic image, and then send it to Slack. So you are not just sending out textual information, you are also giving people a graph over the time period. Yeah, of course. This kind of thing is the perfect task for automation tools: "I want to create a graph of this week's insights on some subject," because you're always going to be looking for a weekly report on some kind of business-related information. So a BI dashboard where you're able to generate these kinds of graphs makes total sense to me; this is something you should be automating. Awesome. So we showed you how you can connect
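The "same query over a different time period" idea Brian describes maps naturally onto QuestDB's `SAMPLE BY` syntax. Here is a minimal sketch in Python: the table name `btc_prices`, the column names, and the connection details in the comments are illustrative assumptions, not something shown in the webinar. The helper only builds the query text; the commented-out snippet shows how you might send it over QuestDB's Postgres wire protocol.

```python
# Sketch only: the table name "btc_prices", column names, and the
# connection details below are illustrative, not from the webinar.
def weekly_average_query(table, value_col, ts_col="timestamp"):
    """Build a QuestDB query for daily averages over the last 7 days."""
    return (
        f"SELECT {ts_col}, avg({value_col}) AS avg_value "
        f"FROM {table} "
        f"WHERE {ts_col} > dateadd('d', -7, now()) "
        f"SAMPLE BY 1d"
    )

sql = weekly_average_query("btc_prices", "price")
print(sql)

# To execute it against a running QuestDB instance you could use the
# Postgres wire protocol on port 8812, e.g. with psycopg2:
#   conn = psycopg2.connect(host="localhost", port=8812, user="admin",
#                           password="quest", dbname="qdb")
#   cur = conn.cursor(); cur.execute(sql); rows = cur.fetchall()
```

Swapping the `dateadd` window or the `SAMPLE BY` bucket is all it takes to turn the same automation into a weekly or monthly report.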

### Poll [45:55]

different services to receive data and trigger your workflow. Now, just another quick poll, so here you go: let me know what you think, is it possible to connect to services that don't have a node in n8n? Okay, we got a mixed reply here, that's interesting. The answers are switching over now, so I'm just going to wait for a couple more seconds. Cool. Okay, so you can connect to services that don't natively have a node in n8n, provided that they have an API: you can use the HTTP Request node to connect to these services. The HTTP Request node supports a lot of request methods. If you want to post some information you can use the POST method; if you want to delete something from your REST API endpoint, you can use the DELETE request method. So there is a lot of flexibility. If your API uses any kind of authentication, you can select the authentication method, configure your credentials, and then connect that service with n8n. Now let's say you want to trigger a workflow from a service that does not have a native trigger node in n8n. This is where you can use the Webhook node: with the Webhook node you can add the webhook URL to that particular service, so any external event happening in that service will trigger your workflow, and then you can build your own automation on top.

Moving forward, Brian, where should we go to learn more about QuestDB? Yeah, so we've open-sourced the repo; it's on GitHub, so if you're interested in having a look, feel free to check it out. And definitely, if you want to try it out, go to demo.questdb.io. There we have a couple of different datasets, including the ten-year New York City taxi dataset where you can query 1.6 billion rows in probably a couple of milliseconds, so I think people are going to be pretty impressed with the performance. So yeah, definitely try out the demo. We also have a dropdown there with some example queries to give you an idea of what you can do, and you can also just drop by our Slack: we have a community Slack at slack.questdb.io, so if you have any questions, drop by and say hello. Awesome. And to try out n8n, you can go to n8n.io to learn how to get started, go to n8n.cloud if you want to use our hosted service, and if you have any questions around n8n, feel free to ask them on community.n8n.io.

Hey, thank you both. Shall we take a look at the questions now? Yeah, sure. Cool, I can see a few coming in here. The first one I see is: "I do not use Slack at work, we use Mattermost; can I use a similar workflow with other messaging services?" That's a really good question. At n8n we also use Mattermost, and we have a lot of workflows that use slash commands, so yes, you can use a similar workflow with Mattermost. Just to give you an example of one of those slash commands, we have a /call command: whenever we want to jump on a call with a teammate, we just use the /call command and it sends a video call link to the teammate, so we can connect with them, talk to them, and have a meeting. That's one of the automations we use. Cool. The next one, I think this is for you, Brian: "Do you have authentication or user management in QuestDB?" At the moment you can authenticate using the Postgres credentials, which are the ones we're using here in this workflow. Authentication for a larger number of users before you hit the database is coming soon. Cool. And then, Herschel, I think this is for you: "Can you download the workflows that are created in n8n, and if so, how can I share them?" Yeah, so you can download the workflow,
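The Webhook node pattern described above (an external service POSTs to a URL, which starts the workflow) can be sketched end to end. This is a hedged illustration, not n8n's actual implementation: a tiny local HTTP server stands in for the Webhook node's endpoint, and the `/webhook/price-alert` path and payload fields are made up for the example.

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import Request, urlopen

received = {}

class StubWebhook(BaseHTTPRequestHandler):
    """Stands in for an n8n Webhook node endpoint in this sketch."""
    def do_POST(self):
        length = int(self.headers["Content-Length"])
        # In n8n, this parsed body is what downstream nodes receive.
        received["body"] = json.loads(self.rfile.read(length))
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"ok")

    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), StubWebhook)
threading.Thread(target=server.serve_forever, daemon=True).start()

# In a real workflow this URL comes from the Webhook node's settings,
# e.g. https://<your-n8n-host>/webhook/<path>; here we hit the stub.
url = f"http://127.0.0.1:{server.server_port}/webhook/price-alert"
payload = json.dumps({"symbol": "BTC", "price": 57000}).encode()
req = Request(url, data=payload,
              headers={"Content-Type": "application/json"})
with urlopen(req) as resp:
    status = resp.status
print(status, received["body"]["symbol"])  # → 200 BTC

server.shutdown()
```

Any service that can fire an outbound HTTP request on an event can trigger a workflow this way, which is what makes the Webhook node a catch-all for services without a native trigger node.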

### Download workflow [51:20]

so let me show you how it's done. If you click on the options here, you get the option to download, and it will download a JSON file; it will show up on screen. So yeah, it downloads a JSON file, and you can then share that JSON file with your teammate or colleague. Another interesting point here: since I already have my credentials configured for Slack, those credentials won't be shared with anyone else. The credentials are decoupled, so even if you're sharing your workflow, you're not sharing your credentials; that's how you stay safe while sharing your workflow. Cool. Another thing you can do is select the nodes by dragging across them and do a Command-C, which copies the JSON of the selected nodes so you can paste it elsewhere as well.

Cool, the last one, I'm happy to take this: "Does n8n support nodes and functionality for custom IoT applications, namely MQTT, CoAP, OPC UA?" So n8n does support a couple of different streaming inputs, especially when it comes to IoT applications. We support Kafka, we have support for AMQP 1.0 (correct me if I'm wrong), and we also have MQTT support. And for the things where we do not have anything existing, you can use the Function node to write custom JavaScript snippets. Just as an example, I purchased some cheap, what do you call them, smart bulbs, exactly, smart bulbs, because Philips Hue is too expensive. They used a gateway service called Tuya Smart, and they had an SDK available, but we didn't have a node for that, so we used their REST API and were able to connect with it. In the process of authentication it required a very specific kind of cryptography on the key before sending it through, which wasn't available at that moment in the Crypto node, so you can also import libraries in the Function node as you would while writing Node.js code, or maybe something in Python. But if you are not familiar with programming, another thing you can do, if you would like a node for your use case, is open a feature request on community.n8n.io, and the more support it gets, the sooner we'll be able to pick it up. That's another way of requesting a feature you might want in n8n.

All right, that seems like that was the last question, so with that I'd like to thank you all for joining us today, and thank you for presenting, Herschel and Brian. In case we haven't answered your question yet, or if any other questions come up, please do visit us at community.n8n.io and we shall get back to you very soon. So thank you very much and enjoy the rest of your day. Thank you, thanks, bye!
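To round off Herschel's point about exported workflows keeping credentials decoupled, here is a small sketch. The JSON below is a simplified, made-up stand-in for an n8n workflow export, not the exact export schema: the key idea it illustrates is that nodes reference a credential by name only, so the secret values never appear in the shared file.

```python
import json

# Simplified stand-in for an exported n8n workflow; the structure is
# illustrative, not the exact export schema.
workflow_json = """
{
  "name": "BTC price alert",
  "nodes": [
    {
      "name": "Slack",
      "type": "n8n-nodes-base.slack",
      "parameters": {"channel": "#alerts"},
      "credentials": {"slackApi": "My Slack account"}
    }
  ]
}
"""

workflow = json.loads(workflow_json)

# Credentials are referenced by a display name only; no tokens or
# passwords are embedded, so sharing the file does not leak secrets.
for node in workflow["nodes"]:
    for cred_type, cred_name in node.get("credentials", {}).items():
        print(f"{node['name']} references credential "
              f"'{cred_name}' ({cred_type})")
```

Whoever imports the file then attaches their own stored credentials under the same name, which is why sharing a workflow JSON is safe.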
