# Large Language Models (LLMs)
Agents-Flex provides an abstract interface for Large Language Models (LLMs), `Llm.java`,
which supports two chat methods: `chat` and `chatStream`.
For various vendors, Agents-Flex offers different implementation classes and communication protocols, including `HTTP`, `SSE`, and `WebSocket` clients.
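
The examples below all use the blocking `chat` method. The following is a minimal sketch of the streaming variant; the exact `StreamResponseListener` callback signature assumed here (a `ChatContext` plus the partial message response) may differ between Agents-Flex versions.

```java
public static void main(String[] args) {
    Llm llm = OpenAiLlm.of("sk-rts5NF6n*******");
    SimplePrompt prompt = new SimplePrompt("what is your name?");

    // Assumption: chatStream delivers partial responses through a listener callback;
    // each callback carries the message chunk produced so far.
    llm.chatStream(prompt, (context, response) -> {
        System.out.println(response.getMessage().getContent());
    });

    // Keep this demo JVM alive long enough to receive the streamed chunks.
    try {
        Thread.sleep(10_000);
    } catch (InterruptedException e) {
        Thread.currentThread().interrupt();
    }
}
```
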
## Chat
In chat conversations with AI LLMs, we need to consider several different scenarios:
- Simple chat
- Chat with Histories
- Function Calling

These capabilities are determined by the prompt, so Agents-Flex provides three prompt implementations:
- SimplePrompt: Used for simple chat
- HistoriesPrompt: Used for chat with Histories
- FunctionPrompt: Used for Function Calling

During the interaction between prompts and LLMs, messages are exchanged. Therefore, Agents-Flex also provides different message implementations:
- **AiMessage**: The message returned by the LLM, which includes not only the message content but also data such as token consumption (see the sketch after this list).
- **FunctionMessage**: A subclass of AiMessage, returned when using a FunctionPrompt with the `chat` method.
- **HumanMessage**: Represents messages entered by the human user during a conversation.
- **SystemMessage**: Represents system messages, often used to tell the LLM what role it should play, which is useful for tuning prompt behavior.
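
A minimal sketch of inspecting an AiMessage is shown below; the getter names used for token usage are assumptions and may vary between Agents-Flex versions.

```java
// A minimal sketch (not part of the examples below) of reading AiMessage data.
// Assumption: AiMessage exposes token usage through the getters shown here.
public static void printAiMessage(AiMessage message) {
    System.out.println("content: " + message.getContent());
    System.out.println("prompt tokens: " + message.getPromptTokens());
    System.out.println("completion tokens: " + message.getCompletionTokens());
    System.out.println("total tokens: " + message.getTotalTokens());
}
```
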
### Example Code
**Simple Chat**
```java
public static void main(String[] args) {
    Llm llm = OpenAiLlm.of("sk-rts5NF6n*******");
    Prompt prompt = new SimplePrompt("what is your name?");

    String response = llm.chat(prompt);
    System.out.println(response);
}
```
**Chat with Histories**
```java
public static void main(String[] args) {
    Llm llm = OpenAiLlm.of("sk-rts5NF6n*******");

    HistoriesPrompt prompt = new HistoriesPrompt();
    prompt.addMessage(new SystemMessage("You are now a database development engineer...."));
    prompt.addMessage(new HumanMessage("Please provide the DDL content for...."));

    String response = llm.chat(prompt);
    System.out.println(response);
}
```
**Function Calling**

Utility class definition:
```java
public class WeatherUtil {

    @FunctionDef(name = "get_the_weather_info", description = "get the weather info")
    public static String getWeatherInfo(
        @FunctionParam(name = "city", description = "the city name") String name) {
        // In a real application, call a third-party weather API here.
        return "The weather in " + name + " is cloudy turning overcast.";
    }
}
```
Create a FunctionPrompt and pass it to the LLM through the `chat` method:
```java
public static void main(String[] args) {
    OpenAiLlmConfig config = new OpenAiLlmConfig();
    config.setApiKey("sk-rts5NF6n*******");
    OpenAiLlm llm = new OpenAiLlm(config);

    FunctionPrompt prompt = new FunctionPrompt("how's the weather in New York?", WeatherUtil.class);
    FunctionResultResponse response = llm.chat(prompt);

    // Execute the utility class method to obtain the result.
    Object result = response.getFunctionResult();

    System.out.println(result);
    // "The weather in New York is cloudy turning overcast."
}
```