IN-DEPTH TECH

5 min read

Published on 06/21/2024
Last updated on 02/03/2025
Streamlining LLM interactions: Qwik as a LangServe playground
If you are considering using LangServe as an API server for your large language models (LLMs) with LangChain but want to use a framework other than React, then you’re in luck. In this article, I will explore the integration of the Qwik frontend framework with LangServe to demonstrate how Qwik's efficient server-side generator functions enable seamless event streaming of LLM interactions.
The current LangServe Chat Playground is clean and well-written. However, the Azure/fetch-event-source library it uses adds complexity and boilerplate to the React code. By pairing Qwik with a new, stream-based fetch-event-source library built on generator functions, we can dramatically reduce that boilerplate to something that is easy to follow.
Discovering a more efficient approach to React
I was very intrigued as I watched the LangServe Chat Playground Launch video and wondered what the UI was written in and how it received and displayed the streamed tokens from the LLM in a chat-based interface. As tokens are generated by the model, the server sends them to the client incrementally, typically using server-sent events (SSE) or WebSockets.
After digging into the LangServe Chat Playground code, I discovered it used Azure/fetch-event-source. This library handles server-sent events and uses callbacks as its standard approach. The more I dug into how the Chat Playground React code was written and observed its existing complexity, the more I became convinced that there was a more efficient approach.
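For context, consuming a stream with this library looks roughly like the following. This is a generic sketch using the published @microsoft/fetch-event-source package; the endpoint and payload are placeholders, not the playground's actual request.

```ts
import { fetchEventSource } from '@microsoft/fetch-event-source';

// Placeholder endpoint and payload; the real playground targets LangServe routes.
await fetchEventSource('/my-llm/stream', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ input: 'Hello!' }),
  async onopen(response) {
    if (!response.ok) throw new Error(`Unexpected status: ${response.status}`);
  },
  onmessage(msg) {
    // Each streamed token arrives as a server-sent event.
    console.log(msg.event, msg.data);
  },
  onclose() {
    console.log('stream closed');
  },
  onerror(err) {
    // Rethrowing stops the library's automatic retries.
    throw err;
  },
});
```

Every piece of application logic ends up inside one of these handlers, which is exactly where the complexity in the playground code comes from.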
Complexity of the current React code
Let us look at some of this complexity. Take a few minutes to read useStreamLog.tsx and useStreamCallback.tsx. I would assert that you must be fairly advanced in your React knowledge to grok this.
For example:
Gist: engineersamuel/e4c3a5f44591f8563fed6c5d5261b182
Now, try to trace the flow from typing input to receiving a response from the LLM via the fetch-event-source library.
Gist: engineersamuel/14d6ec613ec537c9c0c41f1ab4824544
While this is relatively simple-looking code, notice the complexity of the functions, which are curried with layers of callbacks. Again, this is well-written code, but it has echoes of "cascading promise hell" with nested then statements vs. the cleanness of async/await. The direct comparison then becomes callbacks vs. generator functions.
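To see the contrast in miniature, compare assembling streamed tokens through callbacks with doing the same through an async generator and a plain for await loop. These helpers are invented for illustration and are not from either library.

```ts
type Handlers = {
  onmessage: (token: string) => void;
  onclose: () => void;
};

// Callback style: control is inverted and any shared state has to live in
// closures (or refs, in React).
function streamWithCallbacks(tokens: string[], handlers: Handlers): void {
  for (const token of tokens) handlers.onmessage(token);
  handlers.onclose();
}

// Generator style: the same tokens exposed as an async iterable.
async function* streamAsGenerator(tokens: string[]): AsyncGenerator<string> {
  for (const token of tokens) yield token;
}

async function demo(): Promise<void> {
  const tokens = ['Ahoy', ', ', 'matey', '!'];

  let viaCallbacks = '';
  streamWithCallbacks(tokens, {
    onmessage: (t) => (viaCallbacks += t),
    onclose: () => console.log('callbacks:', viaCallbacks),
  });

  // A plain loop: completion handling is simply the code after the loop.
  let viaGenerator = '';
  for await (const t of streamAsGenerator(tokens)) {
    viaGenerator += t;
  }
  console.log('generator:', viaGenerator);
}

demo();
```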
Refactoring to Qwik
After I reviewed and understood the React code and how it used fetch-event-source, I saw an opportunity to leverage Qwik's server$ generator functions to significantly reduce the boilerplate and produce a much more readable flow.
Here are the tools Qwik gives us to improve the code:
- server$ functions can be written as generator functions (though this requires that the fetch-event-source library be rewritten).
- Signals reduce or eliminate the conceptual need for React’s useEffect/useCallback, which means we can remove the useStreamLog.tsx and useStreamCallback.tsx code entirely; that code is where most of the complexity lives. (A small sketch of this follows below.)
But this can only be done if we have a generator function-based fetch-event-source library. Thus, I created fetch-event-source-stream, a hard fork of Azure/fetch-event-source. I would encourage you to read the code, but I will focus primarily on the usage difference below.
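To make the second point about Signals concrete, here is a minimal, hypothetical sketch (not code from the playground) in which the handler simply closes over a signal; there are no dependency arrays or memoized callbacks to maintain.

```tsx
import { component$, useSignal } from '@builder.io/qwik';

// Minimal sketch: a signal holds the accumulated output. The inline handler
// closes over the signal directly; no useCallback, no useEffect.
export const StreamedOutput = component$(() => {
  const output = useSignal('');

  return (
    <>
      <button onClick$={() => (output.value += 'token ')}>Append token</button>
      <pre>{output.value}</pre>
    </>
  );
});
```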
Here is what the React code looks like when it invokes Azure/fetch-event-source. Notice the `useCallback` wrappers and the `onmessage` callback, which cryptically invokes `chunkRef.current?.(JSON.parse(msg.data), innerLatest)`.
Gist: engineersamuel/a9aec29a8b809a141adb1fd252eb5d2e
Now let us look at the Qwik version, qwik-langserve-playground, with the new fetch-event-source-stream. Notice that the `server$` function named `sendPayload` simply invokes the event source and yields its results. Then `initiateStream` (the analog of `startStream` in the React version) awaits the result of the `server$` `sendPayload` function and performs the associated logic based on each event's type. Gone are the refs, callbacks, hooks, and the bulk of the boilerplate and complexity.
Gist: engineersamuel/168aff4b183dc6926129c3803070cf17
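If you would rather not open the gist, the shape of the pattern is sketched below. The hard-coded chunks stand in for what sendPayload actually yields from the LangServe stream via fetch-event-source-stream, and the component is trimmed to the essentials.

```tsx
import { component$, useSignal, $ } from '@builder.io/qwik';
import { server$ } from '@builder.io/qwik-city';

// Sketch only: the real sendPayload yields events read from the LangServe
// stream via fetch-event-source-stream; hard-coded chunks stand in here.
export const sendPayload = server$(async function* (input: string) {
  for (const chunk of ['Ahoy', ', ', 'matey', '!']) {
    yield { event: 'data', data: chunk };
  }
});

export const Chat = component$(() => {
  const output = useSignal('');

  // Analog of the React startStream: await the server$ generator and loop
  // over it; the event type drives what happens to each message.
  const initiateStream = $(async (input: string) => {
    for await (const msg of await sendPayload(input)) {
      if (msg.event === 'data') {
        output.value += msg.data;
      }
    }
  });

  return (
    <>
      <button onClick$={() => initiateStream('Hello!')}>Send</button>
      <pre>{output.value}</pre>
    </>
  );
});
```

The for await loop on the client transparently consumes the values yielded on the server, which is what removes the need for the hook and callback plumbing.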
Qwik shines here with its ability to call a server-side generator function and seamlessly loop over that server-side function in the client-side code. When I started exploring this, it was not clear how much code I would be able to get rid of, but I was able to eliminate all the React hook-based code.
I remember the days of the old-style React components without hooks, and I understand why React went the way of hooks. However, hooks can introduce a lot of boilerplate and mental overhead to trace through in more complicated scenarios.
Try it yourself
- pip install langchain-cli
- langchain app new my-app
- When prompted with “What package would you like to add?” type: pirate-speak
- It will say, “1 added. Any more packages (leave blank to end)?”; hit Enter
- Then it will prompt, “Would you like to install these templates into your environment with pip? [y/N]”; type y
- cd my-app/
- Then run `poetry shell`
- Edit app/server.py and replace the `add_routes(app, NotImplemented)` line with the routes shown in the following gist:
- Gist: engineersamuel/fb15d4358f2fa39cab660fa0d6c2733b
- Now run `langchain serve` and verify you can access http://127.0.0.1:8000/pirate-speak/playground/ (an optional script to sanity-check the stream endpoint directly follows this list)
- Next, let us use Qwik.
- git clone git@github.com:engineersamuel/qwik-langserve-playground.git
- cd qwik-langserve-playground
- Run `npm i` to install dependencies
- Create a .env file and add: `LANGSERVE_BASE_URL=http://127.0.0.1:8000/pirate-speak`
- Run `npm run dev`
- Open http://localhost:5173/ in your browser
- Type “Hello!” into the chat input
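If you want to confirm the LangServe backend is streaming before pointing the Qwik UI at it, a short standalone script like the one below (not part of either repo) reads the /stream response directly. The input shape is my assumption for the pirate-speak chain; check http://127.0.0.1:8000/pirate-speak/input_schema if it differs.

```ts
// Optional sanity check; run with Node 18+ (built-in fetch).
const BASE_URL = 'http://127.0.0.1:8000/pirate-speak';

async function checkStream(): Promise<void> {
  const res = await fetch(`${BASE_URL}/stream`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    // Assumed input shape for pirate-speak; verify against /input_schema.
    body: JSON.stringify({ input: { text: 'Hello!' } }),
  });

  if (!res.ok || !res.body) {
    throw new Error(`Stream request failed: ${res.status}`);
  }

  // LangServe streams server-sent events; print the raw event lines.
  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  for (;;) {
    const { value, done } = await reader.read();
    if (done) break;
    console.log(decoder.decode(value, { stream: true }));
  }
}

checkStream().catch(console.error);
```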
Another win for Qwik as a front-end framework for LangServe
I continue to be impressed with the efficiency and expressivity of Qwik and have found it to be an excellent match as a front-end framework for LangServe when you want a separate, independent UI.
For more on the benefits of Qwik, please see my previous blog post: Qwik vs. Next.js: Which framework is right for your next web project?
