this will be one of my favorite competitions for 2026. i've used the AI SDK so much this past year, very excited somebody who knows their shit VERY well is making a competitor :]
-- my links
twitter: https://x.com/joshtriedcoding
github: https://github.com/joschan21
Table of contents (3 segments)
Segment 1 (00:00 - 05:00)
Yo, you know the Vercel AI SDK, right? That really popular, really modern way to build good AI agents. Well, they have a new competitor: TanStack AI. TanStack is the team behind React Query, one of the most used libraries in the React ecosystem. So if somebody can pull off becoming better than the Vercel AI SDK, this might be it. So this is TanStack AI, and it's the first software I've ever seen that actually has a shot at competing with the Vercel AI SDK. And TanStack AI is not an addition to the Vercel AI SDK, it's a direct competitor. You either use the AI SDK or you use TanStack AI, and this looks tempting, man. A powerful open-source AI SDK with a unified interface across multiple providers. This unified interface, by the way, is the biggest selling point for either of these SDKs, because the most useful part we get from these SDKs is not needing to worry whether we use OpenAI, Anthropic, or Google Gemini; it's all the same code, right? The code doesn't change. That's by far the biggest benefit that the AI SDK provides, but now also TanStack AI, right? It works with vanilla TypeScript, React, Solid, basically anything. It all routes to the TanStack AI client here in the middle. We use any LLM, it's all the same code, just a different adapter, and then it ports that over to whatever language we're using. And yes, this also supports PHP and Python. And feature-wise, to be honest, for this pre-release version, I'm impressed with TanStack AI. This is not a 1.0. This is not even a beta. TanStack AI is an alpha, but it already ships multi-provider support: OpenAI, Anthropic, Ollama, etc. Because this is the main value proposition, right? This unified API, the same interface across all providers. No matter what AI you're using, it doesn't actually matter; the protocol stays the same, only the adapter changes, with automatic type inference from adapters.
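To make the "unified interface, different adapter" idea concrete, here is a minimal TypeScript sketch of the pattern. All names here (`Adapter`, `chat`, the option fields) are illustrative inventions for this demo, not TanStack AI's actual API, and the "models" just echo instead of calling a real provider:

```typescript
// Hypothetical sketch of a unified provider interface: the calling code
// is identical for every provider; only the adapter (and the model names
// and option types inferred from it) changes.

interface Adapter<Model extends string, Options> {
  name: string;
  models: readonly Model[];
  // A real SDK would call the provider's HTTP API here; we just echo.
  complete(model: Model, prompt: string, options?: Options): string;
}

// Provider-specific options, typed per adapter (names are made up).
interface OpenAIOptions { promptCacheKey?: string }
interface AnthropicOptions { thinkingBudget?: number }

const openai: Adapter<"gpt-4o" | "gpt-4o-mini", OpenAIOptions> = {
  name: "openai",
  models: ["gpt-4o", "gpt-4o-mini"],
  complete: (model, prompt) => `[${model}] echo: ${prompt}`,
};

const anthropic: Adapter<"claude-sonnet-4-5", AnthropicOptions> = {
  name: "anthropic",
  models: ["claude-sonnet-4-5"],
  complete: (model, prompt) => `[${model}] echo: ${prompt}`,
};

// One chat() call for everything: the generics infer which models and
// provider options are legal from whichever adapter you pass in.
function chat<M extends string, O>(
  adapter: Adapter<M, O>,
  model: M,
  prompt: string,
  options?: O
): string {
  return adapter.complete(model, prompt, options);
}

console.log(chat(openai, "gpt-4o", "hi", { promptCacheKey: "k1" }));
console.log(chat(anthropic, "claude-sonnet-4-5", "hi"));
```

Passing `{ promptCacheKey: ... }` to the `anthropic` adapter would be a compile error here, which is the same kind of per-provider type safety the video demonstrates.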
This is really cool, because what that allows TanStack AI to do, I prepared a little code demo here, is this. If we use the OpenAI adapter here on the left-hand side, we pass the messages, everything stays the same. But for one, we get model autocompletion, right? It knows which models exist on the OpenAI adapter. This is cool, but this is not unheard of, right? This is not some super unique feature. But at the same time, we also get provider options specific to OpenAI, right? So anything we now pass in the provider options, the prompt cache key, the retention, the safety identifier, anything we want, is now type-safe and matches the OpenAI provider, right? If we used Anthropic here, some of these provider options might not even fly for Anthropic, so they wouldn't even be offered. So this is 100% type-safe, and the DX, even for a pre-release version, is pretty impressive. And then the last important thing that TanStack tries to do here is tool and function calling, because this is the most important part of building agents, right? An agent is an AI that has access to tools, and it does that really well with an automatic execution loop that doesn't need manual tool management. That's a really fancy way of saying, basically, we have loop control in our SDK, right? It's very similar to how the AI SDK works, and it works with this agent loop strategy property that we have over here. We can set this, for example, to a max of five iterations: if the agent went through five iterations and called tools, hey, we're done, we want to return to the front end. Similarly, there are other ways; we can even combine strategies, so we can use multiple at the same time. This takes an array, and so on. I think this is one of the most important parts of building modern AI agents, and TanStack does this really well. Right? So it's basically what it's supposed to be, right? This is a pre-release version. We have provider support, tool calling; this is pretty much it.
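The agent loop with a max-iterations stop strategy can be sketched in plain TypeScript. This is not TanStack AI's actual loop-control API (the real `agentLoopStrategy` property may behave differently); the model here is a stub that requests one tool call before answering:

```typescript
// Sketch of an agent execution loop with a "max iterations" stop strategy.
// fakeModel stands in for an LLM: it asks for a tool until it has seen a
// tool result in the history, then produces a final answer.

type ToolCall = { tool: string; args: unknown };
type ModelTurn = { text: string; toolCalls: ToolCall[] };

function fakeModel(history: string[]): ModelTurn {
  if (history.some((m) => m.startsWith("tool:"))) {
    return { text: "final answer", toolCalls: [] };
  }
  return { text: "", toolCalls: [{ tool: "search", args: { q: "weather" } }] };
}

function runAgent(
  prompt: string,
  maxIterations: number
): { text: string; iterations: number } {
  const history = [prompt];
  for (let i = 1; i <= maxIterations; i++) {
    const turn = fakeModel(history);
    if (turn.toolCalls.length === 0) {
      // No more tool calls: the agent finished before hitting the cap.
      return { text: turn.text, iterations: i };
    }
    // Execute each requested tool and feed the result back into history.
    for (const call of turn.toolCalls) {
      history.push(`tool:${call.tool} -> result`);
    }
  }
  // Stop strategy triggered: give up after the iteration cap.
  return { text: "stopped: max iterations reached", iterations: maxIterations };
}

console.log(runAgent("what's the weather?", 5));
```

Combining strategies, as the video mentions, would just mean checking several stop conditions (iteration count, token budget, a specific tool being called) at the top of this loop.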
But we also have multiple language support that hasn't shipped yet. And their main attack vector on the AI SDK is this: it's not tied to a vendor. TanStack AI is a pure open-source ecosystem of libraries and standards, not a service. We connect you directly to the AI providers you choose, OpenAI, Anthropic, and so on, with no middleman, no service fees. They might be jabbing at the Vercel AI Gateway here. And no vendor lock-in, which doesn't really exist for the AI SDK either; I will get to that. Just powerful type-safe tools built by and for the community. This last part sounds like it's been ChatGPT-generated. Anyway, man. So the attack vector they have on the AI SDK is this vendor lock-in. Why does that matter? The Vercel AI SDK is by far the biggest and most popular way to
Segment 2 (05:00 - 10:00)
build AI agents, right? They introduced it back in 2023. Man, let me check the status. Holy, it's been a while: June 2023. And since then, the AI SDK has been going crazy, man. Just this year, right at the beginning of 2025, it was at like 800,000 downloads, and now it's at 5.4 million. So it's like 5x'd, 6x'd in growth. It's been going absolutely crazy. And the interesting part is, the AI SDK from Vercel is building the entire foundation for their AI stack, right? So they're building a lot on top of this: the AI Elements, which is basically a UI toolkit that helps you build AI applications, in hopes, of course, that you're going to host these apps on Vercel. They host the AI Accelerator, which is a startup program specifically for AI applications that, hopefully, you're going to host on Vercel, right? They have their open-source program. They're investing very heavily in the AI SDK and also make money from it, for example with the AI Gateway. So this is a paid product that Vercel introduced recently, the Vercel AI Gateway, that basically replaces OpenRouter: it's one API key that you can use for any model. This is a very simple concept that works at scale, proven by OpenRouter. The AI Gateway: developers access hundreds of AI models through a centralized interface and ship with ease. Basically, use any model just by passing a different string, as long as you have your environment variable set for the AI Gateway. They're building their entire stack on the control they got, or still get, through the Vercel AI SDK. So business-wise, this is an extremely clever move, and I think that is the reason why TanStack is now trying to do the same. TanStack, on the other hand, is a very big open-source ecosystem, but they don't have AI elements. They don't have an accelerator. They don't even have a gateway. They are now starting in this niche, which the Vercel AI SDK has been dominating for two years, man. So, to say this is going to be a hard start, right?
A really hard time for Tanner and the team is an understatement. It's extremely difficult. But at the same time, they have a massive ecosystem: TanStack Start, Router, Query, Table. That is so established in the React ecosystem, it is de facto the best way to implement data fetching, virtualization, tables, everything TanStack offers. It has become so widely used that I genuinely think these guys have a shot at dethroning the AI SDK. So let's take a look at how this looks in code, right? And this is it. It's really, really easy. This is our entire back end. We destructure the messages and the conversation ID from the request, and this is all the code we need to get started with TanStack AI. Man, we have a stream. In the AI SDK, this would be called generateText or streamText, because now we're actually streaming. We're passing it the OpenAI adapter, the messages, the previous chat history, the model that we want to use, fully type-safe, and the conversation ID, and then we return a server-sent event stream to the client. Now, in the AI SDK, we have a really useful hook that we can use called useChat; it's the same thing in TanStack AI, man. And then we just have a connection. In the AI SDK, we have transports; here, we have a fetch server-sent events connection that we can use. And just like that, we built a complete AI chat, hosted on localhost, that we can ask, "Hey, how are you?", powered by GPT-4o, which we specified here on the back end. And this looks super basic, because I didn't pay any attention to how this is styled, but it works, right? It just streams the events in, in so little code. This is a beautiful API, because Tanner Linsley and team know how to make good APIs, right? They've proven that absolutely with React Query, right? That has become the absolute standard for doing this. And let's take a look at the protocol, right? Because this is different from the Vercel AI SDK protocol. Let's say "hello world," see what it responds, and then take a look at the data that we get back here.
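Conceptually, the back end's job is to take delta chunks from a model stream and encode them as server-sent events. Here is a framework-free TypeScript sketch of just that encoding step; the function names and the chunk shape are illustrative, not TanStack AI's actual internals:

```typescript
// SSE frames are "data: <payload>\n\n" blocks sent over a
// text/event-stream HTTP response. Each model delta becomes one frame.

type Chunk = { type: "delta"; id: string; delta: string };

// Encode a single chunk as one SSE frame.
function toSSE(chunk: Chunk): string {
  return `data: ${JSON.stringify(chunk)}\n\n`;
}

// Encode a whole sequence of deltas into the SSE response body.
function streamToSSE(deltas: string[], id: string): string {
  return deltas.map((delta) => toSSE({ type: "delta", id, delta })).join("");
}

const body = streamToSSE(["Hel", "lo"], "msg_1");
console.log(body);
```

In a real route handler, each frame would be written to the response as the model produces it, rather than joined up front; joining here just keeps the sketch synchronous and testable.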
And never mind, Zen doesn't even support that. So let's go into Chrome and take a look at the protocol. This is it. Interesting. This looks very different from the AI SDK protocol, but also kind of similar, right? We have a type, content, and ID. There's a response ID that we can use to enable durable streams for this. We have a model, a timestamp, and then the delta, right? So the chunking is very similar, right? We send along deltas through the server-sent event stream, and that's how we get the streaming effect on the client as the response is generated. And we have a role of assistant, a role of user. So the
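On the client side, the streaming effect comes from reassembling those deltas in order. A minimal sketch of that, assuming the `data: <json>` SSE framing and a `delta` field like the chunks seen in the network tab (the exact field set in TanStack AI's protocol may differ):

```typescript
// Parse an SSE body and concatenate the deltas into the final message.

type DeltaChunk = {
  type: string;
  id: string;
  role?: "assistant" | "user";
  delta?: string;
};

function accumulate(sseBody: string): string {
  let content = "";
  for (const line of sseBody.split("\n")) {
    if (!line.startsWith("data: ")) continue; // skip blank separator lines
    const chunk: DeltaChunk = JSON.parse(line.slice("data: ".length));
    if (chunk.delta) content += chunk.delta; // append each delta in order
  }
  return content;
}

const frames =
  'data: {"type":"delta","id":"m1","delta":"Hello"}\n\n' +
  'data: {"type":"delta","id":"m1","delta":" world"}\n\n';
console.log(accumulate(frames));
```

This is essentially what a hook like useChat does for you: consume the event stream, fold the deltas into the current assistant message, and re-render as each one arrives.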
Segment 3 (10:00 - 10:00)
important parts that feel native to the AI SDK, like for example the role, the content, the timestamp, are very similar in TanStack AI, man. So I'm genuinely curious how well this will do, especially considering that, you know, the AI SDK is so established. These guys absolutely know what they're doing, though. They have a bunch of partners, and I'm curious, man. I like the approach: there's no vendor lock-in, it's very easy to set up, and you can use any AI model. My only hope is that they will support OpenRouter in the near future, which is basically the Vercel AI Gateway, but I like that version more, the OpenRouter version. And I'm very excited to see where this goes, man. A powerful open-source AI SDK with a unified interface across multiple providers. Let's see how it goes. Thanks so much for watching. I will see you in the next video. Until then, have a good one and bye-bye.