
Extensions Overview

The Jan desktop client ships with several default extensions built on its extension framework to enhance the user experience. This guide lists the default extensions and explains how to configure their settings.

Default Extensions

The default extensions are listed under Settings > Extensions.

List of Default Extensions

| Extension Name | Version | Description | Source Code Link |
|---|---|---|---|
| Assistant Extension | v1.0.0 | This extension enables assistants, including Jan, a default assistant that can call all downloaded models. | Source |
| Conversational Extension | v1.0.0 | This extension enables conversations and state persistence via your filesystem. | Source |
| Inference Engine Nitro Extension | v1.0.0 | This extension embeds Nitro, a lightweight (3 MB) inference engine in C++. See https://nitro.jan.ai. | Source |
| Inference Engine OpenAI Extension | v1.0.0 | This extension enables OpenAI chat completion API calls. | Source |
| Inference Engine Triton TRT-LLM Extension | v1.0.0 | This extension enables Nvidia's TensorRT-LLM as an inference engine option. | Source |
| Inference Engine TensorRT-LLM Extension | v0.0.3 | This extension enables Nvidia's TensorRT-LLM for the fastest GPU acceleration. | Source |
| Inference Engine MistralAI Extension | v1.0.0 | This extension enables Mistral chat completion API calls. | Source |
| Inference Engine Groq Extension | v1.0.0 | This extension enables Groq chat completion API calls. | Source |
| HuggingFace Extension | v1.0.0 | This extension converts HF models to GGUF. | Source |
| Model Management Extension | v1.0.30 | This extension provides model exploration and seamless downloads. | Source |
| System Monitoring Extension | v1.0.10 | This extension offers system health and OS-level data. | Source |

Configure Extension Default Settings

To configure extension default settings:

  1. Navigate to the ~/jan/extensions folder.
  2. Open the extensions.json file.
  3. Edit the file using the options below:
| Option | Description |
|---|---|
| _active | Enable/disable the extension. |
| listeners | Default listener setting. |
| origin | Extension file path. |
| installOptions | Version and metadata configuration. |
| name | Extension name. |
| productName | Extension display name. |
| version | Extension version. |
| main | Main file path. |
| description | Extension description. |
| url | Extension URL. |

{
  "@janhq/conversational-extension": {
    "_active": true,
    "listeners": {},
    "origin": "C:\\Users\\ACER\\AppData\\Local\\Programs\\jan\\resources\\app.asar.unpacked\\pre-install\\janhq-conversational-extension-1.0.0.tgz",
    "installOptions": {
      "version": false,
      "fullMetadata": true
    },
    "name": "@janhq/conversational-extension",
    "productName": "Conversational",
    "version": "1.0.0",
    "main": "dist/index.js",
    "description": "This extension enables conversations and state persistence via your filesystem",
    "url": "extension://@janhq/conversational-extension/dist/index.js"
  },
  "@janhq/model-extension": {
    "_active": true,
    "listeners": {},
    "origin": "C:\\Users\\ACER\\AppData\\Local\\Programs\\jan\\resources\\app.asar.unpacked\\pre-install\\janhq-model-extension-1.0.30.tgz",
    "installOptions": {
      "version": false,
      "fullMetadata": true
    },
    "name": "@janhq/model-extension",
    "productName": "Model Management",
    "version": "1.0.30",
    "main": "dist/index.js",
    "description": "Model Management Extension provides model exploration and seamless downloads",
    "url": "extension://@janhq/model-extension/dist/index.js"
  },
  "@janhq/inference-mistral-extension": {
    "_active": false,
    "listeners": {},
    "origin": "C:\\Users\\ACER\\AppData\\Local\\Programs\\jan\\resources\\app.asar.unpacked\\pre-install\\janhq-inference-mistral-extension-1.0.0.tgz",
    "installOptions": {
      "version": false,
      "fullMetadata": true
    },
    "name": "@janhq/inference-mistral-extension",
    "productName": "MistralAI Inference Engine",
    "version": "1.0.0",
    "main": "dist/index.js",
    "description": "This extension enables Mistral chat completion API calls",
    "url": "extension://@janhq/inference-mistral-extension/dist/index.js"
  },
  "@janhq/inference-groq-extension": {
    "_active": true,
    "listeners": {},
    "origin": "C:\\Users\\ACER\\AppData\\Local\\Programs\\jan\\resources\\app.asar.unpacked\\pre-install\\janhq-inference-groq-extension-1.0.0.tgz",
    "installOptions": {
      "version": false,
      "fullMetadata": true
    },
    "name": "@janhq/inference-groq-extension",
    "productName": "Groq Inference Engine",
    "version": "1.0.0",
    "main": "dist/index.js",
    "description": "This extension enables fast Groq chat completion API calls",
    "url": "extension://@janhq/inference-groq-extension/dist/index.js"
  },
  "@janhq/inference-openai-extension": {
    "_active": true,
    "listeners": {},
    "origin": "C:\\Users\\ACER\\AppData\\Local\\Programs\\jan\\resources\\app.asar.unpacked\\pre-install\\janhq-inference-openai-extension-1.0.0.tgz",
    "installOptions": {
      "version": false,
      "fullMetadata": true
    },
    "name": "@janhq/inference-openai-extension",
    "productName": "OpenAI Inference Engine",
    "version": "1.0.0",
    "main": "dist/index.js",
    "description": "This extension enables OpenAI chat completion API calls",
    "url": "extension://@janhq/inference-openai-extension/dist/index.js"
  },
  "@janhq/inference-triton-trt-llm-extension": {
    "_active": true,
    "listeners": {},
    "origin": "C:\\Users\\ACER\\AppData\\Local\\Programs\\jan\\resources\\app.asar.unpacked\\pre-install\\janhq-inference-triton-trt-llm-extension-1.0.0.tgz",
    "installOptions": {
      "version": false,
      "fullMetadata": true
    },
    "name": "@janhq/inference-triton-trt-llm-extension",
    "productName": "Triton-TRT-LLM Inference Engine",
    "version": "1.0.0",
    "main": "dist/index.js",
    "description": "This extension enables Nvidia's TensorRT-LLM as an inference engine option",
    "url": "extension://@janhq/inference-triton-trt-llm-extension/dist/index.js"
  },
  "@janhq/huggingface-extension": {
    "_active": true,
    "listeners": {},
    "origin": "C:\\Users\\ACER\\AppData\\Local\\Programs\\jan\\resources\\app.asar.unpacked\\pre-install\\janhq-huggingface-extension-1.0.0.tgz",
    "installOptions": {
      "version": false,
      "fullMetadata": true
    },
    "name": "@janhq/huggingface-extension",
    "productName": "HuggingFace",
    "version": "1.0.0",
    "main": "dist/index.js",
    "description": "Hugging Face extension for converting HF models to GGUF",
    "url": "extension://@janhq/huggingface-extension/dist/index.js"
  },
  "@janhq/monitoring-extension": {
    "_active": true,
    "listeners": {},
    "origin": "C:\\Users\\ACER\\AppData\\Local\\Programs\\jan\\resources\\app.asar.unpacked\\pre-install\\janhq-monitoring-extension-1.0.10.tgz",
    "installOptions": {
      "version": false,
      "fullMetadata": true
    },
    "name": "@janhq/monitoring-extension",
    "productName": "System Monitoring",
    "version": "1.0.10",
    "main": "dist/index.js",
    "description": "This extension provides system health and OS level data",
    "url": "extension://@janhq/monitoring-extension/dist/index.js"
  },
  "@janhq/assistant-extension": {
    "_active": true,
    "listeners": {},
    "origin": "C:\\Users\\ACER\\AppData\\Local\\Programs\\jan\\resources\\app.asar.unpacked\\pre-install\\janhq-assistant-extension-1.0.1.tgz",
    "installOptions": {
      "version": false,
      "fullMetadata": true
    },
    "name": "@janhq/assistant-extension",
    "productName": "Jan Assistant",
    "version": "1.0.1",
    "main": "dist/index.js",
    "description": "This extension enables assistants, including Jan, a default assistant that can call all downloaded models",
    "url": "extension://@janhq/assistant-extension/dist/index.js"
  },
  "@janhq/tensorrt-llm-extension": {
    "_active": true,
    "listeners": {},
    "origin": "C:\\Users\\ACER\\AppData\\Local\\Programs\\jan\\resources\\app.asar.unpacked\\pre-install\\janhq-tensorrt-llm-extension-0.0.3.tgz",
    "installOptions": {
      "version": false,
      "fullMetadata": true
    },
    "name": "@janhq/tensorrt-llm-extension",
    "productName": "TensorRT-LLM Inference Engine",
    "version": "0.0.3",
    "main": "dist/index.js",
    "description": "This extension enables Nvidia's TensorRT-LLM for the fastest GPU acceleration. See the [setup guide](https://jan.ai/guides/providers/tensorrt-llm/) for next steps.",
    "url": "extension://@janhq/tensorrt-llm-extension/dist/index.js"
  },
  "@janhq/inference-nitro-extension": {
    "_active": true,
    "listeners": {},
    "origin": "C:\\Users\\ACER\\AppData\\Local\\Programs\\jan\\resources\\app.asar.unpacked\\pre-install\\janhq-inference-nitro-extension-1.0.0.tgz",
    "installOptions": {
      "version": false,
      "fullMetadata": true
    },
    "name": "@janhq/inference-nitro-extension",
    "productName": "Nitro Inference Engine",
    "version": "1.0.0",
    "main": "dist/index.js",
    "description": "This extension embeds Nitro, a lightweight (3mb) inference engine written in C++. See https://nitro.jan.ai.\nAdditional dependencies could be installed to run without Cuda Toolkit installation.",
    "url": "extension://@janhq/inference-nitro-extension/dist/index.js"
  }
}
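
Each top-level key in this file corresponds to one extension. If you prefer to script a change rather than edit the file by hand, a minimal Node.js sketch like the one below can flip the _active flag for a given extension. This is illustrative only: it assumes the default Jan data folder (~/jan/extensions/extensions.json) and a Node.js environment, and the extension name used here is just an example. You will likely need to restart Jan for the change to take effect.

```typescript
// toggle-extension.ts (illustrative sketch, assuming the default Jan data folder)
// Flips the `_active` flag for one extension in ~/jan/extensions/extensions.json.
import { readFileSync, writeFileSync } from "fs";
import { homedir } from "os";
import { join } from "path";

const extensionsFile = join(homedir(), "jan", "extensions", "extensions.json");
const target = "@janhq/inference-mistral-extension"; // extension to enable/disable

const extensions = JSON.parse(readFileSync(extensionsFile, "utf8"));

if (!extensions[target]) {
  throw new Error(`${target} not found in ${extensionsFile}`);
}

// Toggle the enable/disable flag and write the file back with 2-space indentation.
extensions[target]._active = !extensions[target]._active;
writeFileSync(extensionsFile, JSON.stringify(extensions, null, 2));

console.log(`${target} is now ${extensions[target]._active ? "enabled" : "disabled"}`);
```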

Specific Extension Settings

Jan offers an Extensions settings menu for configuring extensions that have registered their settings within the application. Here, you can integrate remote inference engines with Jan without inserting the URL and API key directly into the JSON file. You can also turn the available logging extensions on or off. To access the Extensions settings, follow the steps below:

  1. Navigate to the main dashboard.
  2. Click the gear icon (⚙️) on the bottom left of your screen.



  3. Customize the registered extension settings under the Extensions section.


System Monitor Extension Feature

The System Monitoring extension offers enhanced customization for app logging. You can toggle application logging on or off and set a custom interval for clearing the app logs. To configure app logging, follow these steps:

  1. Navigate to the main dashboard.
  2. Click the gear icon (⚙️) on the bottom left of your screen.



  3. Under the Extensions section, select the System Monitoring extension.



  4. Use the slider to turn the app logging feature on or off.



  5. Specify the log-cleaning interval in milliseconds.



  • You can clear the app logs manually at any time by clicking the Clear logs button in the advanced settings.
  • There is no minimum or maximum interval; however, invalid inputs default to 120000 ms (2 minutes), as illustrated in the sketch below.
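
To make the fallback behavior concrete, the sketch below shows one way such a check could be written. The function names are hypothetical and are not Jan's actual implementation; only the default of 120000 ms (2 minutes) comes from the documentation above.

```typescript
// Illustrative sketch only: hypothetical names, not Jan's actual implementation.
// Any value that is not a positive number falls back to 120000 ms (2 minutes).
const DEFAULT_CLEAR_INTERVAL_MS = 120000;

function resolveLogCleaningInterval(input: unknown): number {
  const value = Number(input);
  return Number.isFinite(value) && value > 0 ? value : DEFAULT_CLEAR_INTERVAL_MS;
}

// A hypothetical scheduler that clears logs at the resolved interval.
function scheduleLogCleaning(input: unknown, clearAppLogs: () => void): void {
  setInterval(clearAppLogs, resolveLogCleaningInterval(input));
}

console.log(resolveLogCleaningInterval(5000));   // 5000
console.log(resolveLogCleaningInterval("abc"));  // 120000 (invalid input uses the default)
```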