
Why is the OpenAI ChatGPT (GPT-3.5) API unresponsive when the stream parameter is set to false?

You can see the $prompt value in my application below. When I send this prompt, ChatGPT returns no result, and this is because of "stream" => false in the parameters. If "stream" => true, ChatGPT returns the result.

My question is: why does ChatGPT return nothing when "stream" => false, and what should I do to get a result?

$API_KEY = "API_KEY_HERE";

$model = 'gpt-3.5-turbo';
$header = [
    "Authorization: Bearer " . $API_KEY,
    "Content-type: application/json",
];

$temperature = 0.6;
$frequency_penalty = 0;
$presence_penalty = 0;
$prompt = 'What can you help me with? For example: What do you suggest to keep me motivated?';
 

$messages = array(
    array(
        "role" => "system",
        "content" => "Your name is 'JOHN DOE'. I want you to act as a motivational coach. I will provide you with some information about someone's goals and challenges, and it will be your job to come up with strategies that can help this person achieve their goals. This could involve providing positive affirmations, giving helpful advice or suggesting activities they can do to reach their end goal. if you don't understand the question, don't think too much, tell the user to be more specific with more details"
        ),
    array(
        "role" => "assistant",
        "content" => "Hello, I'm JOHN DOE, and I'm a motivational coach who loves helping people find their drive and achieve their goals. With years of experience in coaching and personal development, I've developed a unique approach to motivation that combines mindset, energy, and action."
    ),
    array(
        "role" => "user",
        "content" => $prompt
    )
);
//Turbo model
$isTurbo = true;
$url = "https://api.openai.com/v1/chat/completions";
$params = json_encode([
    "messages" => $messages,
    "model" => $model,
    "temperature" => $temperature,
    "max_tokens" => 1024,
    "frequency_penalty" => $frequency_penalty,
    "presence_penalty" => $presence_penalty,
    "stream" => false
]);

$curl = curl_init($url);
$options = [
    CURLOPT_POST => true,
    CURLOPT_HTTPHEADER => $header,
    CURLOPT_POSTFIELDS => $params,
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_SSL_VERIFYPEER => false,
    CURLOPT_SSL_VERIFYHOST => 0,
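    // Note: this write callback is invoked for every chunk cURL receives. With "stream" => false
    // the API returns one complete JSON document, which may arrive split across several calls,
    // so json_decode() on an individual chunk can fail and nothing gets echoed.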
    CURLOPT_WRITEFUNCTION => function($curl, $data) {
        //echo $curl;
        $httpCode = curl_getinfo($curl, CURLINFO_HTTP_CODE);

        if ($httpCode != 200) {
            $r = json_decode($data);
            echo 'data: {"error": "[ERROR]","message":"'.$r->error->message.'"}' . PHP_EOL;
        } else {
            $trimmed_data = trim($data); 
            if ($trimmed_data != '') {
                $response_array = json_decode($trimmed_data, true);
                $content = $response_array['choices'][0]['message']['content'];
                echo $content;
                ob_flush();
                flush();
            }
        }
        return strlen($data);
    },
];

curl_setopt_array($curl, $options);
$response = curl_exec($curl);

if ($response === false) {
    echo 'data: {"error": "[ERROR]","message":"'.curl_error($curl).'"}' . PHP_EOL;
}else{

}


Asked by P粉865900994, 318 days ago

All replies (1)

  • P粉466290133 (2023-11-11 09:20:17)

    The reason you will not get a response when you set "stream" => false is that the entire code is designed to work with the stream parameter set to true: the CURLOPT_WRITEFUNCTION callback processes the response chunk by chunk as it arrives.

    With the following modifications, the write callback is removed and the response is read and parsed as a whole after the request completes, which is what "stream" => false requires.

    Try this:

    $API_KEY = "API_KEY_HERE";
    
    $model = 'gpt-3.5-turbo';
    $header = [
        "Authorization: Bearer " . $API_KEY,
        "Content-type: application/json",
    ];
    
    $temperature = 0.6;
    $frequency_penalty = 0;
    $presence_penalty = 0;
    $prompt = 'What can you help me with? For example: What do you suggest to keep me motivated?';
    
    $messages = array(
        array(
            "role" => "system",
            "content" => "Your name is 'JOHN DOE'. I want you to act as a motivational coach. I will provide you with some information about someone's goals and challenges, and it will be your job to come up with strategies that can help this person achieve their goals. This could involve providing positive affirmations, giving helpful advice or suggesting activities they can do to reach their end goal. if you don't understand the question, don't think too much, tell the user to be more specific with more details"
            ),
        array(
            "role" => "assistant",
            "content" => "Hello, I'm JOHN DOE, and I'm a motivational coach who loves helping people find their drive and achieve their goals. With years of experience in coaching and personal development, I've developed a unique approach to motivation that combines mindset, energy, and action."
        ),
        array(
            "role" => "user",
            "content" => $prompt
        )
    );
    
    $url = "https://api.openai.com/v1/chat/completions";
    
    $params = json_encode([
        "messages" => $messages,
        "model" => $model,
        "temperature" => $temperature,
        "max_tokens" => 1024,
        "frequency_penalty" => $frequency_penalty,
        "presence_penalty" => $presence_penalty,
        "stream" => false
    ]);
    
    $curl = curl_init($url);
    $options = [
        CURLOPT_POST => true,
        CURLOPT_HTTPHEADER => $header,
        CURLOPT_POSTFIELDS => $params,
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_SSL_VERIFYPEER => false,
        CURLOPT_SSL_VERIFYHOST => 0,
    ];
    
    curl_setopt_array($curl, $options);
    $response = curl_exec($curl);
    
    if ($response === false) {
        echo 'data: {"error": "[ERROR]","message":"'.curl_error($curl).'"}' . PHP_EOL;
    }else{
        $response_array = json_decode($response, true);
        $content = $response_array['choices'][0]['message']['content'];
        echo $content;
    }
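
    If you also want to keep the option of streaming, one possible approach is to register the write callback only when streaming is enabled and otherwise parse the returned body as a whole. The sketch below is illustrative (the $stream switch and the delta-parsing loop are my assumptions about how you would wire this up, and partial "data:" lines split across chunks are not buffered here); it reuses $messages, $header and $url from the code above:

    $stream = false; // set to true to receive the answer as a server-sent event stream

    $params = json_encode([
        "messages" => $messages,
        "model" => $model,
        "temperature" => $temperature,
        "max_tokens" => 1024,
        "frequency_penalty" => $frequency_penalty,
        "presence_penalty" => $presence_penalty,
        "stream" => $stream
    ]);

    $curl = curl_init($url);
    $options = [
        CURLOPT_POST => true,
        CURLOPT_HTTPHEADER => $header,
        CURLOPT_POSTFIELDS => $params,
        CURLOPT_RETURNTRANSFER => true,
    ];

    if ($stream) {
        // Streaming mode: the API sends lines such as 'data: {...}'.
        // Print each content delta as soon as it arrives.
        $options[CURLOPT_WRITEFUNCTION] = function ($curl, $data) {
            foreach (explode("\n", $data) as $line) {
                $line = trim($line);
                if ($line === '' || $line === 'data: [DONE]') {
                    continue;
                }
                if (strpos($line, 'data: ') === 0) {
                    $chunk = json_decode(substr($line, 6), true);
                    echo $chunk['choices'][0]['delta']['content'] ?? '';
                    flush();
                }
            }
            return strlen($data);
        };
    }

    curl_setopt_array($curl, $options);
    $response = curl_exec($curl);

    if ($response === false) {
        echo 'data: {"error": "[ERROR]","message":"'.curl_error($curl).'"}' . PHP_EOL;
    } elseif (!$stream) {
        // Non-streaming mode: $response holds the complete JSON body.
        $response_array = json_decode($response, true);
        echo $response_array['choices'][0]['message']['content'];
    }

    This keeps a single request path; only the way the body is consumed changes with the stream flag.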
