How we use Orq.ai, LangChain, LangSmith, and LangGraph
AI - Koen van den Brink
At El Niño, we develop many LLM-powered features and applications. When we started researching tools to build our own suite of LLM tooling, we could not find good comparisons between some of them. Let’s get into it.

Do you have to choose? And if so: what would you even base your choice on?

The landscape

In recent years, the landscape of AI development tools has expanded rapidly, offering developers an increasingly sophisticated array of options for building and deploying AI applications with LLMs. As these tools become more specialized and interconnected, understanding their distinct roles and capabilities has become crucial for developers. It has also become more difficult.

In this post, we will discuss four key players in the AI development ecosystem: LangChain, LangGraph, LangSmith, and Orq.ai. Each tool serves a specific purpose in the AI development lifecycle, from foundation and development to workflow control and orchestration. Some complement each other and some compete. So which technologies should you use, and why?

LangChain: The Foundation

At its core, LangChain standardizes the interaction between LLMs and other components, making it easier to build sophisticated AI applications without dealing with the underlying complexity of each individual component.

At El Niño, we use LangChain extensively as the foundation for our AI applications. Its abstraction layers help us focus on building business logic rather than dealing with the intricacies of LLM integration. We particularly value:

  • The standardized interface for different LLM providers, allowing us to easily switch between models (see the sketch after this list)
  • Built-in memory systems that help maintain context in conversational applications
  • The extensive collection of pre-built chains that accelerate development
  • Tools and agents that enable our AI applications to interact with external systems

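To make that first point concrete, here is a minimal sketch of the provider-agnostic interface, assuming a recent langchain version with the relevant provider packages installed and API keys set in the environment (the model names are just examples):

    # Build a chat model from a plain string; the provider is a parameter.
    from langchain.chat_models import init_chat_model

    model = init_chat_model("gpt-4o", model_provider="openai")
    # Switching providers is a one-line change, e.g.:
    # model = init_chat_model("claude-3-5-sonnet-latest", model_provider="anthropic")

    response = model.invoke("Summarize our return policy in one sentence.")
    print(response.content)
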
On the other hand, we've also encountered challenges, particularly when building more complex applications that require custom chains and specialized agents. This is where complementary tools like LangGraph become valuable.

LangGraph: The Flow Controller

LangGraph introduces state machines to AI development, providing a structured way to manage complex workflows and decision-making processes. State machines help developers model AI behavior as a series of discrete states, with clear transitions and rules governing how the AI moves between these states.

As an extension of LangChain, LangGraph builds on that foundation while adding sophisticated flow control. This combination allows developers to create more deterministic and controllable AI applications. The integration is seamless, meaning you can use all your favorite LangChain components while gaining the benefits of state-based control.

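As an illustration of the state-machine idea, here is a minimal LangGraph sketch, assuming langgraph is installed; the node bodies are placeholders where real classification and LLM calls would go:

    from typing import TypedDict

    from langgraph.graph import StateGraph, START, END

    class State(TypedDict):
        question: str
        answer: str

    def classify(state: State) -> dict:
        # A real node would tag the question type here.
        return {}

    def answer(state: State) -> dict:
        # A real node would call an LLM here; returned keys merge into the state.
        return {"answer": "Our team will get back to you shortly."}

    builder = StateGraph(State)
    builder.add_node("classify", classify)
    builder.add_node("answer", answer)
    builder.add_edge(START, "classify")
    builder.add_edge("classify", "answer")
    builder.add_edge("answer", END)

    graph = builder.compile()
    result = graph.invoke({"question": "Where is my order?"})
    print(result["answer"])
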
When building complex AI workflows, LangGraph shines by enabling:

  • Clear visualization of application states and transitions
  • Better error handling and recovery mechanisms
  • More predictable behavior in complex scenarios
  • Easier debugging and maintenance of AI workflows

At El Niño, we've found LangGraph particularly useful for applications that require multiple decision points or parallel processing paths. For example, in customer service automation, it helps manage the flow between different response types, escalation procedures, and follow-up actions.

In our experience, LangGraph has been especially helpful because of its streaming option with human intervention. We have set up a special router that reports which agent would run next, but asks the developer for confirmation first. The developer can then roll back to a previous graph state or change the routing. This makes it possible to find and debug multiple issues in a single run of the system.
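
A hedged sketch of that pattern, continuing the builder from the previous example: LangGraph's checkpointing lets you pause before a node, inspect or rewrite the state, and resume. In our real setup the interrupted node is the router that picks the next agent; here we simply pause before the placeholder "answer" node:

    from langgraph.checkpoint.memory import MemorySaver

    # Compile with a checkpointer and pause before the chosen node runs.
    graph = builder.compile(
        checkpointer=MemorySaver(),
        interrupt_before=["answer"],
    )
    config = {"configurable": {"thread_id": "debug-run-1"}}

    # Stream until the graph hits the interrupt.
    for event in graph.stream({"question": "Where is my order?"}, config):
        print(event)

    # Inspect the paused state; update_state could rewrite it (e.g. to
    # change the routing) before resuming with stream(None, ...).
    snapshot = graph.get_state(config)
    print(snapshot.next)  # the node(s) that would run next
    for event in graph.stream(None, config):
        print(event)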

Orq.ai: Our choice over LangSmith

Both LangSmith and Orq.ai serve as development and monitoring platforms for LLM applications, but with different approaches and capabilities.

The relationship with LangChain is a key differentiator. LangSmith is developed by the LangChain team and offers native integration, making it a natural choice for developers heavily invested in the LangChain ecosystem. Orq.ai, while supporting LangChain, takes a more platform-agnostic approach, allowing integration with various frameworks and tools. This is exactly why we at El Niño prefer Orq.ai: it allows us to more easily move from the LangChain suite to newer or better technologies when needed in the future.

When it comes to debugging and monitoring capabilities, both platforms offer comprehensive solutions. LangSmith provides detailed traces of chain and agent executions, helping developers understand how their LangChain applications behave. Orq.ai offers similar debugging features but extends them with real-time monitoring and alerting systems for production environments.
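
For reference, LangSmith's native integration really is lightweight: at the time of writing, tracing is enabled purely through environment configuration, with no changes to the chain code itself (the values below are placeholders):

    import os

    # Any LangChain code run after this is traced to the named project.
    os.environ["LANGCHAIN_TRACING_V2"] = "true"
    os.environ["LANGCHAIN_API_KEY"] = "<your-langsmith-api-key>"
    os.environ["LANGCHAIN_PROJECT"] = "customer-service-bot"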

Performance optimization is another crucial aspect where these platforms differ. LangSmith focuses on optimizing LangChain-specific components and provides tools for testing and evaluating different chain configurations. Orq.ai takes a broader approach to performance optimization, offering features like automated model selection, cost optimization, and scalability management across different deployment scenarios.

In general, we use Orq.ai for quickly prototyping and setting up RAG applications, since this accounts for roughly 90% of the time we spend on LLM-related projects. We may introduce custom vector-store retrievers with LangChain in production, but usually Orq.ai handles it well up to and including production.
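
For the cases where we do move retrieval into LangChain, the pattern looks roughly like this; a sketch assuming the langchain-openai package and an in-memory store, with placeholder documents:

    from langchain_core.vectorstores import InMemoryVectorStore
    from langchain_openai import OpenAIEmbeddings

    # Embed and index a handful of documents (placeholders here).
    store = InMemoryVectorStore(embedding=OpenAIEmbeddings())
    store.add_texts([
        "Returns are accepted within 30 days of delivery.",
        "Standard shipping takes 3-5 business days.",
    ])

    # Expose the store as a retriever and fetch the top matches.
    retriever = store.as_retriever(search_kwargs={"k": 2})
    docs = retriever.invoke("What is the return window?")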

An outline of the tools we use at El Niño. Orq.ai specializes in RAG, and is getting better in other areas.

About deployment

We should keep in mind that all of these tools apply abstraction layers, and with them overhead, to the products we create. At El Niño, we use them for rapid prototyping and then investigate whether significant optimization steps are needed for a production environment. Usually these can be achieved with parallelization, caching, and the like. However, in some cases this suite of tools may introduce too much overhead, and a production-ready product would have to be rebuilt using as few abstraction layers as possible.
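
As one example of the cheaper optimizations, LangChain ships a built-in LLM cache so that repeated identical calls skip the provider round-trip entirely; a minimal sketch:

    from langchain_core.caches import InMemoryCache
    from langchain_core.globals import set_llm_cache

    # All subsequent identical model calls are served from memory.
    set_llm_cache(InMemoryCache())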

Be careful:

  • LangChain provides easy-to-use abstraction layers, but this means that making custom calls to agents gets a bit more complicated.
  • All these tools add overhead in the form of latency to your backend.
  • All these tools are in early stages of development and may change significantly, requiring some refactoring.

Conclusion

In short, the tools we use at El Niño are the following:

  • LangChain: development of AI agents and integrations with different AI components.
  • LangGraph: development of multi-agent systems and managing complex AI system states.
  • Orq.ai: a platform to prototype, test, debug, and manage our AI systems.

We would recommend starting by learning LangChain for quick prototyping. After that, picking up other tools like Orq.ai as needed is a good next step.

Have a project in mind?

Let's talk and discover the possibilities through technology together. We'd love to learn more about your business, idea or product.

Get in touch