
Undefined array key "openai-processing-ms" #218

Closed
DC-Sebastian opened this issue Sep 27, 2023 · 5 comments
@DC-Sebastian
I am trying to use the AI provider Anyscale, which serves Meta's Llama 2 models and returns responses in the same structure as OpenAI. With the official OpenAI Python library this works by simply setting the OPENAI_API_BASE environment variable to https://api.endpoints.anyscale.com/v1.

When I try to use the base URI api.endpoints.anyscale.com/v1 with openai-php, I get the error message:
Undefined array key "openai-processing-ms" in C:\xampp\htdocs\chatgpt\vendor\openai-php\client\src\Responses\Meta\MetaInformation.php on line 37

Any idea what could be causing this?

Here is the code I use for this:
$client = OpenAI::factory()
    ->withApiKey($yourApiKey)
    ->withBaseUri('api.endpoints.anyscale.com/v1')
    ->make();

@pb30
pb30 commented Sep 27, 2023

This library expects an "openai-processing-ms" header in the response, but Anyscale does not include one.

@DC-Sebastian
DC-Sebastian commented Sep 27, 2023

Is there a possibility to remove the dependency on the header?
Being able to use the Llama 2 models would be quite useful.

@sstalle
sstalle commented Oct 5, 2023

@DC-Sebastian You can add the header to a response before it is processed by the library. The exact way to do this depends on the HTTP client you are using. Here is a quick example for Guzzle:

use GuzzleHttp\Client;
use GuzzleHttp\HandlerStack;
use GuzzleHttp\Middleware;
use Psr\Http\Message\ResponseInterface;

$stack = HandlerStack::create();
$stack->push(
    Middleware::mapResponse(function (ResponseInterface $response) {
        return $response->withHeader('openai-processing-ms', '1');
    })
);
$guzzle = new Client(['handler' => $stack]);

$yourApiKey = '...';
$client = \OpenAI::factory()
    ->withApiKey($yourApiKey)
    ->withBaseUri('api.endpoints.anyscale.com/v1')
    ->withHttpClient($guzzle)
    ->make();

@gehrisandro gehrisandro self-assigned this Oct 6, 2023
@bianchi
bianchi commented Oct 18, 2023

This started happening for me too. I'm using Azure OpenAI.

@rkgo55
rkgo55 commented Oct 20, 2023

Hello.
I also got the same error when using Azure OpenAI.
I think the problem occurs when sending consecutive requests.

A similar error may occur again in the future due to specification changes on the API side.
It would therefore be a good idea to check that each header actually exists wherever the headers are read.
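A defensive read in the spirit of this suggestion would treat the header as optional instead of assuming it is present. Below is a minimal sketch in plain PHP; the `readProcessingMs` function name and the `null` fallback are illustrative assumptions, not the library's actual implementation, and the array shape mirrors PSR-7's `getHeaders()`:

```php
<?php

// Sketch: read an optional response header with a fallback instead of
// assuming it exists. $headers mirrors PSR-7's getHeaders() shape:
// a map of header name => array of string values.
function readProcessingMs(array $headers): ?int
{
    // Anyscale and Azure OpenAI may omit this header entirely,
    // so guard the access rather than indexing unconditionally.
    if (!isset($headers['openai-processing-ms'][0])) {
        return null;
    }

    return (int) $headers['openai-processing-ms'][0];
}

// A response that includes the header is parsed as before...
var_dump(readProcessingMs(['openai-processing-ms' => ['123']])); // int(123)

// ...and one that omits it no longer triggers the warning.
var_dump(readProcessingMs(['content-type' => ['application/json']])); // NULL
```

Until such a guard lands in the library itself, the Guzzle middleware shown earlier in this thread remains a workable client-side workaround.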
