ChatTongyi
Tongyi Qwen is a large language model developed by Alibaba's DAMO Academy. Based on natural-language user input, it understands user intent through natural language understanding and semantic analysis, and provides services and assistance across a variety of domains and tasks. Providing clear and detailed instructions yields results that better align with your expectations. In this notebook, we will introduce how to use LangChain to chat with Tongyi, which corresponds to the langchain.chat_models package in LangChain.
# Install the package
!pip install dashscope
# Get a new token: https://help.aliyun.com/document_detail/611472.html?spm=a2c4g.2399481.0.0
from getpass import getpass
DASHSCOPE_API_KEY = getpass()
import os
os.environ["DASHSCOPE_API_KEY"] = DASHSCOPE_API_KEY
from langchain.chat_models.tongyi import ChatTongyi
from langchain.schema import HumanMessage
chatLLM = ChatTongyi(
    streaming=True,
)
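The API key and model can also be passed to the constructor directly instead of being read from the environment. A minimal sketch, assuming the integration exposes dashscope_api_key and model_name parameters (check your installed version if the names differ):
# Assumed parameter names; verify against your langchain version
chatLLM = ChatTongyi(
    model_name="qwen-turbo",  # assumed model identifier
    dashscope_api_key=DASHSCOPE_API_KEY,
    streaming=True,
)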
res = chatLLM.stream([HumanMessage(content="hi")])
for r in res:
    print("chat resp:", r)
chat resp: content='Hello! How' additional_kwargs={} example=False
chat resp: content=' can I assist you today?' additional_kwargs={} example=False
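Each streamed chunk carries only a fragment of the reply, so you typically accumulate the content yourself. A minimal sketch using the same chatLLM instance:
# Concatenate streamed fragments into a single reply string
full_reply = ""
for chunk in chatLLM.stream([HumanMessage(content="hi")]):
    full_reply += chunk.content
print(full_reply)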
from langchain.schema import HumanMessage, SystemMessage
messages = [
    SystemMessage(
        content="You are a helpful assistant that translates English to French."
    ),
    HumanMessage(
        content="Translate this sentence from English to French. I love programming."
    ),
]
chatLLM(messages)
AIMessageChunk(content="J'aime programmer.", additional_kwargs={}, example=False)
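Several independent requests can also be sent in one batch. A minimal sketch, assuming the standard generate method that LangChain chat models of this version accept for lists of message lists:
# Batch two translation requests in a single generate call
batch = [
    [
        SystemMessage(content="You are a helpful assistant that translates English to French."),
        HumanMessage(content="Translate this sentence from English to French. I love artificial intelligence."),
    ],
    [
        SystemMessage(content="You are a helpful assistant that translates English to French."),
        HumanMessage(content="Translate this sentence from English to French. I love programming."),
    ],
]
result = chatLLM.generate(batch)
for generations in result.generations:
    print(generations[0].text)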