LlamaIndexTS

Data framework for your LLM applications. Focused on server-side solutions.
GitHub
2.62k
Created 2 years ago, last commit 15 hours ago
129 contributors
2.19k commits
Stars added on GitHub, month by month (chart, May 2024 – April 2025)
Stars added on GitHub, per day, on average

  • Yesterday: +1
  • Last week: +2.9/day
  • Last month: +3.5/day
llamaindex on NPM
Monthly downloads on NPM (chart, May 2024 – April 2025)
README

LlamaIndex logo

LlamaIndex.TS

Data framework for your LLM application.


Use your own data with large language models (LLMs such as OpenAI's ChatGPT and others) in JS runtime environments, with TypeScript support.

Documentation: https://ts.llamaindex.ai/

Try examples online:

Open in Stackblitz

What is LlamaIndex.TS?

LlamaIndex.TS aims to be a lightweight, easy-to-use set of libraries that help you integrate large language models into your applications with your own data.

Compatibility

Multiple JS Environment Support

LlamaIndex.TS supports multiple JS environments, including:

  • Node.js >= 20 ✅
  • Deno ✅
  • Bun ✅
  • Nitro ✅
  • Vercel Edge Runtime ✅ (with some limitations)
  • Cloudflare Workers ✅ (with some limitations)

For now, browser support is limited due to the lack of AsyncLocalStorage-like APIs.

Supported LLMs:

  • OpenAI LLMs
  • Anthropic LLMs
  • Groq LLMs
  • Llama2, Llama3, Llama3.1 LLMs
  • MistralAI LLMs
  • Fireworks LLMs
  • DeepSeek LLMs
  • ReplicateAI LLMs
  • TogetherAI LLMs
  • HuggingFace LLMs
  • DeepInfra LLMs
  • Gemini LLMs

Getting started

npm install llamaindex
pnpm install llamaindex
yarn add llamaindex

Setup in Node.js, Deno, Bun, TypeScript...?

See our official document: https://ts.llamaindex.ai/docs/llamaindex/getting_started

Adding provider packages

In most cases, you'll also need to install provider packages to use LlamaIndexTS. These add AI models, file readers for ingestion, or document storage (for example, in vector databases).

For example, to use the OpenAI LLM, you would install the following package:

npm install @llamaindex/openai
pnpm install @llamaindex/openai
yarn add @llamaindex/openai
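
As a quick, hedged sketch (not taken from the official docs), this is roughly what wiring the OpenAI provider into LlamaIndex.TS and running a plain completion looks like; the model name is only an example, and an OPENAI_API_KEY environment variable is assumed:

```ts
import { Settings } from "llamaindex";
import { OpenAI } from "@llamaindex/openai";

// Model name is an example; OPENAI_API_KEY is read from the environment.
const llm = new OpenAI({ model: "gpt-4o-mini" });

// Register it as the default LLM for the rest of LlamaIndex.TS.
Settings.llm = llm;

async function main() {
  // Run a plain completion through the provider.
  const result = await llm.complete({ prompt: "Say hello from LlamaIndex.TS" });
  console.log(result.text);
}

main().catch(console.error);
```

Exact exports and option names can vary between releases, so check the getting started guide above if anything differs.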

Playground

Check out our NextJS playground at https://llama-playground.vercel.app/. The source is available at https://github.com/run-llama/ts-playground

Core concepts for getting started:

See our documentation: https://ts.llamaindex.ai/docs/llamaindex/getting_started/concepts
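
To make those concepts concrete, here is a hedged sketch of the typical Document → Index → QueryEngine flow, reusing the @llamaindex/openai provider from above (model names are examples, and the exact response accessors may differ between versions):

```ts
import { Document, Settings, VectorStoreIndex } from "llamaindex";
import { OpenAI, OpenAIEmbedding } from "@llamaindex/openai";

// LLM and embedding model used for indexing and querying (example models).
Settings.llm = new OpenAI({ model: "gpt-4o-mini" });
Settings.embedModel = new OpenAIEmbedding();

async function main() {
  // 1. Wrap your own data in Documents.
  const document = new Document({
    text: "LlamaIndex.TS is a data framework for LLM applications.",
  });

  // 2. Build a vector index over the documents.
  const index = await VectorStoreIndex.fromDocuments([document]);

  // 3. Query the index with natural language.
  const queryEngine = index.asQueryEngine();
  const response = await queryEngine.query({
    query: "What is LlamaIndex.TS?",
  });
  console.log(response.toString());
}

main().catch(console.error);
```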

Contributing:

Please see our contributing guide for more information. You are highly encouraged to contribute to LlamaIndex.TS!

Community

Please join our Discord! https://discord.com/invite/eN6D2HQ4aX