Build Your Own ChatGPT-4-Coding-Copilot Using the New OpenAI Function Calling
In this article, we want to investigate how we can build a very simple coding-copilot using OpenAI function calling.
OpenAI function calling allows you to send function definitions to the endpoint, and the model will intelligently choose a function and the respective parameters to call. The model does not call the function directly; instead, you parse the model's output and use its answer to call the function in your own code.
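To make this flow concrete before we start: with the official openai Python client (used later in this article), each tool call in the model's response exposes a function name and a JSON string of arguments, roughly like this (assuming response is the object returned by the chat completions call shown further below):

tool_call = response.choices[0].message.tool_calls[0]
tool_call.function.name       # e.g. "write_to_file"
tool_call.function.arguments  # a JSON string, e.g. '{"file_name": "hello_world.py", ...}'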
But what should a coding companion be able to do?
Well, for a very basic coding companion, it should be able to fulfill two tasks:
- Write code
- Read code
Implementing the “write code” functionality
First, we are going to need a Python function that can write a string to a file.
Prompt: Write a python function called “write to file” which writes an input string to a specific file.
Output:
def write_to_file(input_string, file_name):
    try:
        with open(file_name, 'w') as file:
            file.write(input_string)
        print(f'Successfully wrote to {file_name}')
    except Exception as e:
        print(f'Error writing to {file_name}: {e}')

# Example usage:
content_to_write = "Hello, this is a sample content."
file_to_write = "example.txt"
write_to_file(content_to_write, file_to_write)
While this is already very helpful, we need to change all print()-statements into return-statements. This way, our copilot can also read the function output. Our updated code then looks like this:
def write_to_file(input_string, file_name):
    try:
        with open(file_name, 'w') as file:
            file.write(input_string)
        return f'Successfully wrote to {file_name}'
    except Exception as e:
        return f'Error writing to {file_name}: {e}'
Implementing the “read code” functionality
Prompt: Can you write a function for me, which can read the contents of a file by filename?
Output:
def read_from_file(file_name):
    try:
        with open(file_name, 'r') as file:
            content = file.read()
            return content
    except FileNotFoundError:
        print(f'The file {file_name} does not exist.')
    except Exception as e:
        print(f'Error reading from {file_name}: {e}')
    return None

# Example usage:
file_to_read = "example.txt"
read_content = read_from_file(file_to_read)
if read_content is not None:
    print(f'Contents of {file_to_read}:\n{read_content}')
Again, we will switch all print()-statements to return-statements:
def read_from_file(file_name):
    try:
        with open(file_name, 'r') as file:
            content = file.read()
            return content
    except FileNotFoundError:
        return f'The file {file_name} does not exist.'
    except Exception as e:
        return f'Error reading from {file_name}: {e}'
We will add one more function: if the file to be read cannot be found, we would like the companion to also be able to list the contents of the directory.
Prompt: Can you write a function for me, which lists the contents of a directory?
Output (already with modified return-statements):
import os

def list_directory_contents(directory_path="./"):
    try:
        # Get the list of files and subdirectories in the specified directory
        contents = os.listdir(directory_path)
        return f'Contents of {directory_path}:', contents
    except FileNotFoundError:
        return f'The directory {directory_path} does not exist.'
    except Exception as e:
        return f'Error listing contents of {directory_path}: {e}'
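Note that on success this function returns a (message, list) tuple rather than a single string, which is exactly the shape we will see in the test output later on. A quick local check might look like this (the listed filenames are just an example):

print(list_directory_contents("./"))
# ('Contents of ./:', ['example.txt', 'notebook.ipynb'])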
Designing our Copilot
OpenAI function calling requires us to send the definitions of our functions along with every request. These function descriptions are supplied as JSON-style schemas via the tools parameter:
tools = [
    {
        "type": "function",
        "function": {
            "name": "write_to_file",
            "description": "Write contents to a file.",
            "parameters": {
                "type": "object",
                "properties": {
                    "input_string": {
                        "type": "string",
                        "description": "The contents you want to write to the file",
                    },
                    "file_name": {
                        "type": "string",
                        "description": "The name of the file you want to write contents to.",
                    },
                },
                "required": ["input_string", "file_name"],
            },
        },
    },
    {
        "type": "function",
        "function": {
            "name": "read_from_file",
            "description": "Read contents of a file.",
            "parameters": {
                "type": "object",
                "properties": {
                    "file_name": {
                        "type": "string",
                        "description": "The name of the file you want to read.",
                    },
                },
                "required": ["file_name"],
            },
        },
    },
    {
        "type": "function",
        "function": {
            "name": "list_directory_contents",
            "description": "List all the filenames in a directory.",
            "parameters": {
                "type": "object",
                "properties": {
                    "dummy_property": {
                        "type": "null",
                    },
                },
            },
        },
    },
]
Now, we need a client for the OpenAI API:
from openai import OpenAI
import json  # needed below to parse the model's tool-call arguments

client = OpenAI(api_key="sk-...")

response = client.chat.completions.create(
    model="gpt-3.5-turbo-1106",
    messages=messages,  # the conversation so far; built inside run_companion (see below)
    tools=tools,
    tool_choice="auto",
)
And finally, we implement a way to parse the LLM output and call our previously defined functions:
response = client.chat.completions.create(
    model="gpt-3.5-turbo-1106",
    messages=messages,
    tools=tools,
    tool_choice="auto",
)
response_message = response.choices[0].message
tool_calls = response_message.tool_calls

# check if the model wanted to call a function
if tool_calls:
    # call the function
    messages.append(response_message)
    for tool_call in tool_calls:
        function_name = tool_call.function.name
        function_args = json.loads(tool_call.function.arguments)
        if function_name == "write_to_file":
            function_response = write_to_file(
                input_string=function_args.get("input_string"),
                file_name=function_args.get("file_name"),
            )
        elif function_name == "read_from_file":
            function_response = read_from_file(
                file_name=function_args.get("file_name"),
            )
        elif function_name == "list_directory_contents":
            function_response = list_directory_contents()
        messages.append(
            {
                "tool_call_id": tool_call.id,
                "role": "tool",
                "name": function_name,
                "content": function_response,
            }
        )
        print(function_response)
Note: We wrap all of this functionality in a function called run_companion.
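The snippets above are exactly what goes inside it. Here is one possible minimal sketch of run_companion, assuming the client, tools, and helper functions defined earlier; this is an assumption about the wrapper, not the article's verbatim code, and the dispatch simply mirrors the if/elif chain above:

def run_companion(prompt):
    # Minimal sketch: send the prompt, let the model pick a tool,
    # execute it locally, and print the result
    messages = [{"role": "user", "content": prompt}]
    response = client.chat.completions.create(
        model="gpt-3.5-turbo-1106",
        messages=messages,
        tools=tools,
        tool_choice="auto",
    )
    response_message = response.choices[0].message
    tool_calls = response_message.tool_calls
    if tool_calls:
        messages.append(response_message)
        for tool_call in tool_calls:
            function_name = tool_call.function.name
            function_args = json.loads(tool_call.function.arguments)
            if function_name == "write_to_file":
                function_response = write_to_file(
                    input_string=function_args.get("input_string"),
                    file_name=function_args.get("file_name"),
                )
            elif function_name == "read_from_file":
                function_response = read_from_file(
                    file_name=function_args.get("file_name"),
                )
            else:
                function_response = list_directory_contents()
            messages.append(
                {
                    "tool_call_id": tool_call.id,
                    "role": "tool",
                    "name": function_name,
                    "content": str(function_response),
                }
            )
            print(function_response)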
Testing our Companion
We made it! Now it’s time to test our companion.
First, we want to give our companion a proper introduction to the world of programming:
run_companion("Create a new file called hello world with a python script that prints 'Hello World!'.")
> Successfully wrote to hello_world.py
Now, our companion can check for us, if the file was actually added:
run_companion("What are the contents of the current directory?")
> ('Contents of ./:', ['hello_world.py', 'notebook.ipynb', 'env', '.env', 'example.txt'])
And finally, let’s have our companion check the newly created code:
run_companion("What does the file called hello_world.py say?")
> print('Hello World!')
These quick tests are designed to showcase the very basic functionality of the companion. Without adding further functionality, this companion can already handle tasks like the following (example prompts are sketched after the list):
- Write documentation
- Explain scripts
- Write test cases
- Suggest optimizations
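Each of these tasks boils down to reading a file and/or writing one back, so prompts along these lines are already within reach (hypothetical prompts, shown without model output):

run_companion("Read hello_world.py and write a short README.md describing what it does.")
run_companion("Write pytest test cases for hello_world.py into test_hello_world.py.")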
Outlook
Of course, this does not cover all the “functionality” of a real programmer. Our finished companion can be extended simply by adding more functions. Some ideas for extending the companion (the first one is sketched below the list):
- Read documentation
- Run code
- Search StackOverFlow
- Post to forums
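To make the “run code” idea concrete, here is a hypothetical sketch of such an extension. The function name run_python_file and its schema are assumptions, not part of the article's code, and the dispatch in run_companion would need a matching branch as well:

import subprocess

def run_python_file(file_name):
    # Hypothetical helper: run a Python script and return whatever it printed
    try:
        result = subprocess.run(
            ["python", file_name], capture_output=True, text=True, timeout=30
        )
        return result.stdout + result.stderr
    except Exception as e:
        return f'Error running {file_name}: {e}'

tools.append(
    {
        "type": "function",
        "function": {
            "name": "run_python_file",
            "description": "Run a Python file and return its output.",
            "parameters": {
                "type": "object",
                "properties": {
                    "file_name": {
                        "type": "string",
                        "description": "The name of the Python file to run.",
                    },
                },
                "required": ["file_name"],
            },
        },
    }
)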