Function calling in OpenAI: How to connect LLMs with internal tools and improve models


The new Function Calling feature from OpenAI combines the LLM's greatest strength, understanding human language, with the internal tools you already use in your system, or any new ones you add.

Below, we look at how tools work today and how Function Calling from the OpenAI API can make them better.

Function Calling Explained

In an API call, you can describe multiple functions and have the model intelligently choose to output a JSON object containing the arguments to call one or more of them.

NOTE: The Chat Completions API does not call the function itself. The model generates JSON, which you can use to call the function in your own code.
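As an illustration, here is a minimal sketch of a Chat Completions request (openai>=1.0 client) that describes one function and decodes the JSON arguments the model sends back. The `get_current_weather` schema is a hypothetical example, and the request is only issued when you pass in a configured client:

```python
import json

# A hypothetical function schema described to the model; the model never runs it,
# it only emits JSON arguments that match this shape.
WEATHER_TOOL = {
    "type": "function",
    "function": {
        "name": "get_current_weather",
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name, e.g. Paris"},
                "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
            },
            "required": ["city"],
        },
    },
}

def parse_tool_arguments(arguments_json):
    """The model returns function arguments as a JSON string; decode it."""
    return json.loads(arguments_json)

def ask_model(client, user_message):
    """Send a request with the tool described.

    Expects an OpenAI client, e.g. client = OpenAI() from the openai package.
    """
    response = client.chat.completions.create(
        model="gpt-3.5-turbo-1106",
        messages=[{"role": "user", "content": user_message}],
        tools=[WEATHER_TOOL],
    )
    # The message may contain .tool_calls, each with JSON arguments to decode.
    return response.choices[0].message

# The model only emits arguments; your code decodes them and calls the function:
args = parse_tool_arguments('{"city": "Paris", "unit": "celsius"}')
print(args["city"])  # Paris
```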

Use Cases for Function Calling

Here are some examples of Function Calling:

Natural Language Understanding

  • Create a function that sends a text input to a GPT model through an API. The response can include JSON-formatted data containing information extracted from the text, such as named entities, sentiment analysis, and keywords. E.g., convert "Show me my latest prospects with the highest revenue" to get_prospects(min_revenue: int, created_before: string, limit: int) and call your internal API.
Python


import requests

def analyze_text_with_gpt(text):
    api_url = "https://your-gpt-api-url"  # replace with your model endpoint
    data = {"text": text}
    response = requests.post(api_url, json=data)
    result = response.json()
    return result
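Under function calling, the get_prospects example above could be described to the model with a JSON Schema, and the returned arguments dispatched to your internal API. The schema and the query_internal_api stub below are hypothetical stand-ins for your own endpoint:

```python
import json

# Hypothetical schema for an internal get_prospects endpoint.
GET_PROSPECTS_TOOL = {
    "name": "get_prospects",
    "description": "Fetch prospects filtered by revenue and creation date",
    "parameters": {
        "type": "object",
        "properties": {
            "min_revenue": {"type": "integer"},
            "created_before": {"type": "string", "description": "ISO date"},
            "limit": {"type": "integer"},
        },
        "required": ["min_revenue"],
    },
}

def query_internal_api(min_revenue, created_before=None, limit=10):
    # Stand-in for your real internal call; replace with a request to your API.
    return {"min_revenue": min_revenue, "created_before": created_before, "limit": limit}

def dispatch_tool_call(name, arguments_json):
    """Decode the model's JSON arguments and route them to the matching function."""
    args = json.loads(arguments_json)
    if name == "get_prospects":
        return query_internal_api(**args)
    raise ValueError(f"Unknown function: {name}")

result = dispatch_tool_call("get_prospects", '{"min_revenue": 50000, "limit": 5}')
print(result)  # {'min_revenue': 50000, 'created_before': None, 'limit': 5}
```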





Chatbots

  • Implement a function for a chatbot that takes user messages as input and returns JSON responses. The JSON response can include the chatbot's reply and additional information like confidence scores or context.
Python


import requests

def chat_with_gpt(user_message, chat_context):
    api_url = "https://your-gpt-api-url"
    data = {
        "user_message": user_message,
        "chat_context": chat_context
    }
    response = requests.post(api_url, json=data)
    result = response.json()
    return result




Question-Answering System

  • Build a function for a question-answering system that takes a question and a context passage as input. The JSON response can contain the answer to the question and any relevant context.
Python


import requests

def answer_question_with_gpt(question, context):
    api_url = "https://your-gpt-api-url"
    data = {
        "question": question,
        "context": context
    }
    response = requests.post(api_url, json=data)
    result = response.json()
    return result



Sentiment Analysis

  • Create a function that analyzes the sentiment of a text input and returns a JSON response with sentiment scores and labels (e.g., positive, negative, neutral).
Python


import requests

def analyze_sentiment_with_gpt(text):
    api_url = "https://your-gpt-api-url"  # replace with your model endpoint
    data = {"text": text}
    response = requests.post(api_url, json=data)
    result = response.json()
    return result




For these examples, you would need to replace "https://your-gpt-api-url" with the actual API endpoint provided by the GPT model service you are using. These functions make HTTP requests to the GPT model's API and parse the JSON responses for further processing.

Supported models

Function calling is supported with the following models:

  • gpt-4
  • gpt-4-1106-preview
  • gpt-4-0613
  • gpt-3.5-turbo
  • gpt-3.5-turbo-1106
  • gpt-3.5-turbo-0613

Parallel Function Calling

Parallel function calls are advantageous when you need to invoke multiple functions simultaneously. For instance, you might wish to trigger functions to retrieve weather information for three distinct locations concurrently. In such a scenario, the model executes multiple functions within a single response. You can then access the results of each function call by cross-referencing the 'tool_call_id' in the response with the corresponding ID for each tool call.
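A sketch of that flow, using plain dicts in place of real API response objects (the field names follow the Chat Completions format, but the lookup_weather stub and its fake data are hypothetical):

```python
import json

def lookup_weather(city):
    # Hypothetical local implementation of the model-described function.
    fake_db = {"Paris": "18C", "Tokyo": "22C", "Lima": "16C"}
    return fake_db.get(city, "unknown")

def build_tool_messages(tool_calls):
    """Execute each requested call and pair the result with its tool_call_id."""
    messages = []
    for call in tool_calls:
        args = json.loads(call["function"]["arguments"])
        result = lookup_weather(**args)
        messages.append({
            "role": "tool",
            "tool_call_id": call["id"],  # lets the model match result to request
            "content": json.dumps({"weather": result}),
        })
    return messages

# Three parallel calls, as the model might emit them in a single response:
calls = [
    {"id": "call_1", "function": {"name": "get_weather", "arguments": '{"city": "Paris"}'}},
    {"id": "call_2", "function": {"name": "get_weather", "arguments": '{"city": "Tokyo"}'}},
    {"id": "call_3", "function": {"name": "get_weather", "arguments": '{"city": "Lima"}'}},
]
print(build_tool_messages(calls)[0]["tool_call_id"])  # call_1
```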

Supported models for parallel function calls:

  • gpt-4-1106-preview
  • gpt-3.5-turbo-1106

Cost of Function Calling

Functions are injected into the system message in a syntax the model has been trained on. This means functions count against the model's context limit and are billed as input tokens. If you run into context limits, we suggest limiting the number of functions or the length of the documentation you provide for function parameters.
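A rough way to gauge that overhead is to measure the serialized size of your function definitions. The four-characters-per-token ratio below is only a coarse rule of thumb, not the model's real tokenizer:

```python
import json

def estimate_function_tokens(functions):
    """Very rough token estimate: ~4 characters per token for English-like JSON."""
    serialized = json.dumps(functions)
    return len(serialized) // 4

functions = [{
    "name": "get_prospects",
    "description": "Fetch prospects filtered by revenue",
    "parameters": {
        "type": "object",
        "properties": {"min_revenue": {"type": "integer"}},
    },
}]
print(estimate_function_tokens(functions))  # a few dozen tokens for this small schema
```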

Improving Function Calling Usage

Function calling needs constant monitoring in order to improve and optimize its use in your application.

  • First, you need to see how many times each of your functions has been called.
  • Second, it helps to collect user feedback on output quality, because checking it manually would be difficult at scale.
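Both points can be covered with a small in-process tally before reaching for a dedicated tool; a minimal sketch (the 1-to-5 feedback scale here is an assumption):

```python
from collections import Counter

class FunctionUsageTracker:
    """Counts how often each function is called and collects user feedback."""

    def __init__(self):
        self.call_counts = Counter()
        self.feedback = []  # (function_name, rating) pairs, e.g. rating in 1..5

    def record_call(self, function_name):
        self.call_counts[function_name] += 1

    def record_feedback(self, function_name, rating):
        self.feedback.append((function_name, rating))

    def report(self):
        avg = (sum(r for _, r in self.feedback) / len(self.feedback)
               if self.feedback else None)
        return {"calls": dict(self.call_counts), "avg_rating": avg}

tracker = FunctionUsageTracker()
tracker.record_call("get_prospects")
tracker.record_call("get_prospects")
tracker.record_feedback("get_prospects", 4)
print(tracker.report())  # {'calls': {'get_prospects': 2}, 'avg_rating': 4.0}
```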

How to Track Function Usage

Track the progress of your OpenAI functions with GPTBoost!

If you are not certain how to keep track of the hundreds of modifications you implement on a daily basis, create a free account with GPTBoost. You'll be able to see which functions have been called, prioritize your work, and implement feedback and bug reporting to measure your progress.
