Konko
This page covers how to run models on Konko within LangChain.
Konko API is a fully managed API designed to help application developers:
- Select the right LLM(s) for their application
- Prototype with various open-source and proprietary LLMs
- Move to production in line with their security, privacy, throughput, and latency SLAs, without infrastructure set-up or administration, using Konko AI's SOC 2 compliant infrastructure
Installation and Setup
First you'll need an API key
You can request it by messaging support@konko.ai
Install Konko AI's Python SDK
1. Enable a Python 3.8+ environment
2. Set API Keys
Option 1: Set Environment Variables
You can set environment variables for
- KONKO_API_KEY (Required)
- OPENAI_API_KEY (Optional)
In your current shell session, use the export command:
export KONKO_API_KEY={your_KONKO_API_KEY_here}
export OPENAI_API_KEY={your_OPENAI_API_KEY_here} # Optional
Alternatively, you can add the above lines directly to your shell startup script (such as .bashrc or .bash_profile for Bash, or .zshrc for Zsh) so they are set automatically every time a new shell session starts.
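To confirm the variables are visible to Python before moving on, a quick check like the one below can help (a minimal sketch; the messages are just illustrative):

import os

# Confirm the required Konko key is present in the environment
assert os.getenv("KONKO_API_KEY"), "KONKO_API_KEY is not set"

# The OpenAI key is optional, so only warn if it is missing
if not os.getenv("OPENAI_API_KEY"):
    print("OPENAI_API_KEY is not set (optional)")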
Option 2: Set API Keys Programmatically
If you prefer to set your API keys directly within your Python script or Jupyter notebook, you can use the following commands:
import konko

konko.set_api_key('your_KONKO_API_KEY_here')
konko.set_openai_api_key('your_OPENAI_API_KEY_here')  # Optional
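In a shared script or notebook you may prefer not to hard-code keys at all. One option is to prompt for them at runtime; the sketch below uses the standard-library getpass module together with the set_api_key call shown above:

import getpass
import konko

# Prompt for the key interactively instead of embedding it in the file
konko.set_api_key(getpass.getpass("Konko API key: "))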
3. Install the SDK
pip install konko
4. Verify Installation & Authentication
# Confirm konko has installed successfully
import konko

# Confirm API keys from Konko and OpenAI are set properly
konko.Model.list()
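To see which models your account can reach, you can iterate over the list result. The sketch below assumes an OpenAI-style response with a data list of objects carrying an id field; adjust it if your SDK version returns a different shape:

import konko

# Print the model IDs available to your account (assumes an OpenAI-style response)
models = konko.Model.list()
for model in models["data"]:
    print(model["id"])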
Calling a model
Find a model on the Konko Introduction page
For example, for the Llama 2 13B chat model, the model ID is: "meta-llama/Llama-2-13b-chat-hf"
Another way to find the list of models running on the Konko instance is through this endpoint.
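You can also query that endpoint directly over HTTP with your Konko key. The URL below assumes the OpenAI-compatible /v1/models path, so confirm the exact address in the endpoint documentation linked above:

import os
import requests

# Hypothetical direct call to the models endpoint; verify the URL in the Konko docs
response = requests.get(
    "https://api.konko.ai/v1/models",
    headers={"Authorization": f"Bearer {os.environ['KONKO_API_KEY']}"},
)
print(response.json())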
From here, we can initialize our model:
from langchain.chat_models import ChatKonko

chat_instance = ChatKonko(max_tokens=10, model='meta-llama/Llama-2-13b-chat-hf')
And run it:
from langchain.schema import HumanMessage

msg = HumanMessage(content="Hi")
chat_response = chat_instance([msg])
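The call returns a LangChain chat message, so you can read the model's reply from its content attribute:

# Inspect the model's reply (chat models return an AIMessage)
print(chat_response.content)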