Open Interpreter
Integrate Open Interpreter with Jan
Open Interpreter lets LLMs run code (Python, JavaScript, Shell, and more) locally. After installing, you can chat with Open Interpreter through a ChatGPT-like interface in your terminal by running `interpreter`. To integrate Open Interpreter with Jan, follow the steps below:
Step 1: Install Open Interpreter
- Install Open Interpreter by running:
pip install open-interpreter
- A Rust compiler is required to install Open Interpreter. If one is not already installed, run the following command, or go to this page if you are running on Windows:
sudo apt install rustc
The Rust compiler is necessary for building some native extensions that Open Interpreter requires.
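If you are unsure whether a Rust toolchain is already present, a quick check like the one below avoids an unnecessary install. This is a sketch; the status messages are illustrative, not the output of any tool:

```shell
# Check whether a Rust compiler is already on the PATH.
# The messages below are illustrative, not part of any tool's output.
if command -v rustc >/dev/null 2>&1; then
  RUST_STATUS="found: $(rustc --version)"
else
  RUST_STATUS="missing - install rustc before installing Open Interpreter"
fi
echo "$RUST_STATUS"
```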
Step 2: Configure Jan's Local API Server
Before using Open Interpreter, configure the model in Settings > My Model in Jan and activate its local API server.
Enabling Jan API Server
- Click the <> button to access the Local API Server section in Jan.
- Configure the server settings, including IP Port, Cross-Origin Resource Sharing (CORS), and Verbose Server Logs.
- Click Start Server.
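Once the server is running, you can sanity-check it from the terminal. The snippet below is a sketch: it assumes the default port 1337 used elsewhere in this guide and queries the OpenAI-compatible /v1/models endpoint; adjust the URL if you changed the IP Port setting:

```shell
# Sanity check: ask Jan's local API server for its model list.
# Assumes the default address; change it if you configured a different port.
JAN_API_BASE="http://localhost:1337/v1"
if curl -sf "$JAN_API_BASE/models" >/dev/null; then
  SERVER_STATUS="up"
else
  SERVER_STATUS="down (start the server in Jan first)"
fi
echo "Jan API server at $JAN_API_BASE is $SERVER_STATUS"
```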
Step 3: Set the Open Interpreter Environment
- For integration, provide the API Base (http://localhost:1337/v1) and the model ID (e.g., mistral-ins-7b-q4) when running Open Interpreter. For example, see the command below:
interpreter --api_base http://localhost:1337/v1 --model mistral-ins-7b-q4
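Before launching Open Interpreter, you can also confirm that the model ID resolves by sending a minimal chat request straight to Jan's OpenAI-compatible endpoint. This is a sketch: it assumes the server is running on the default port, that mistral-ins-7b-q4 is the model you configured, and the fallback message is illustrative:

```shell
# Minimal chat-completion request against Jan's local server.
# The fallback text is only printed if the server is unreachable.
MODEL="mistral-ins-7b-q4"
RESPONSE=$(curl -s http://localhost:1337/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d "{\"model\": \"$MODEL\", \"messages\": [{\"role\": \"user\", \"content\": \"Hello\"}]}" \
  || echo "server not reachable - start Jan's local API server first")
echo "$RESPONSE"
```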
Open Interpreter is now ready for use!