OpenAI function calling support (#1011)

This commit is contained in:
Sidharth Mohanty
2023-12-18 19:34:15 +05:30
committed by GitHub
parent 33dcfe42b5
commit cd2c40a9c4
3 changed files with 147 additions and 13 deletions

@@ -60,9 +60,129 @@ llm:
top_p: 1
stream: false
```
</CodeGroup>
### Function Calling
To enable [function calling](https://platform.openai.com/docs/guides/function-calling) in your application using embedchain and OpenAI, pass a list of functions to the `OpenAILlm` class via its `functions` argument. Here are several ways to do that:
Examples:
<Accordion title="Using Pydantic Models">
```python
import os

from embedchain import Pipeline as App
from embedchain.llm.openai import OpenAILlm
from pydantic import BaseModel, Field, field_validator

os.environ["OPENAI_API_KEY"] = "sk-xxx"


class QA(BaseModel):
    """A question and answer pair."""

    question: str = Field(
        ..., description="The question.", example="What is a mountain?"
    )
    answer: str = Field(
        ..., description="The answer.", example="A mountain is a hill."
    )
    person_who_is_asking: str = Field(
        ..., description="The person who is asking the question.", example="John"
    )

    @field_validator("question")
    def question_must_end_with_a_question_mark(cls, v):
        """Validate that the question ends with a question mark."""
        if not v.endswith("?"):
            raise ValueError("question must end with a question mark")
        return v

    @field_validator("answer")
    def answer_must_end_with_a_period(cls, v):
        """Validate that the answer ends with a period."""
        if not v.endswith("."):
            raise ValueError("answer must end with a period")
        return v


llm = OpenAILlm(config=None, functions=[QA])
app = App(llm=llm)

result = app.query("Hey I am Sid. What is a mountain? A mountain is a hill.")
print(result)
```
</Accordion>
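Independently of embedchain, you can inspect the JSON schema that a Pydantic v2 model produces; this is roughly the shape of the `parameters` object that function-calling APIs consume. A minimal sketch (the `QA` model here mirrors a simplified version of the example above):

```python
from pydantic import BaseModel, Field


class QA(BaseModel):
    """A question and answer pair."""

    question: str = Field(..., description="The question.")
    answer: str = Field(..., description="The answer.")


# Pydantic v2 emits a JSON schema describing the model's fields;
# function-calling wrappers send a schema like this to the API.
schema = QA.model_json_schema()
print(schema["required"])  # ['question', 'answer']
```

Inspecting the schema this way is a quick sanity check that your field names and descriptions are what the model will actually see.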
<Accordion title="Using OpenAI JSON schema">
```python
import os

from embedchain import Pipeline as App
from embedchain.llm.openai import OpenAILlm

os.environ["OPENAI_API_KEY"] = "sk-xxx"

json_schema = {
    "name": "get_qa",
    "description": "A question and answer pair and the user who is asking the question.",
    "parameters": {
        "type": "object",
        "properties": {
            "question": {"type": "string", "description": "The question."},
            "answer": {"type": "string", "description": "The answer."},
            "person_who_is_asking": {
                "type": "string",
                "description": "The person who is asking the question.",
            },
        },
        "required": ["question", "answer", "person_who_is_asking"],
    },
}

llm = OpenAILlm(config=None, functions=[json_schema])
app = App(llm=llm)

result = app.query("Hey I am Sid. What is a mountain? A mountain is a hill.")
print(result)
```
</Accordion>
<Accordion title="Using actual Python functions">
```python
import os

import requests
from embedchain import Pipeline as App
from embedchain.llm.openai import OpenAILlm

os.environ["OPENAI_API_KEY"] = "sk-xxx"


def find_info_of_pokemon(pokemon: str):
    """
    Find the information of the given pokemon.

    Args:
        pokemon: The pokemon.
    """
    req = requests.get(f"https://pokeapi.co/api/v2/pokemon/{pokemon}")
    if req.status_code == 404:
        raise ValueError("pokemon not found")
    return req.json()


llm = OpenAILlm(config=None, functions=[find_info_of_pokemon])
app = App(llm=llm)

result = app.query("Tell me more about the pokemon pikachu.")
print(result)
```
</Accordion>
## Google AI
To use a Google AI model, you have to set the `GOOGLE_API_KEY` environment variable. You can obtain a Google API key from the [Google Maker Suite](https://makersuite.google.com/app/apikey).
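A config sketch in the same yaml style as the OpenAI config above; the `google` provider name and `gemini-pro` model name are assumptions here, so check the current embedchain config reference for the exact keys:

```yaml
llm:
  provider: google  # assumed provider key
  config:
    model: gemini-pro  # assumed model name
    temperature: 0.5
    top_p: 1
```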