`@nexys/openai-client` is a simple, efficient TypeScript client for interacting with OpenAI models such as GPT-3.5 Turbo directly from your TypeScript or JavaScript applications. This npm package wraps the OpenAI API in a clean, type-safe interface for sending chat completion requests.
- Send chat completion requests to the OpenAI API with just a few lines of code.
- TypeScript support for a strongly typed development experience.
- Simple error handling and efficient request sending.
- Supports OpenAI's `gpt-3.5-turbo` model.
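The package exports `Message` and `OpenAiModel` types alongside `getChatCompletion`. As a rough orientation, here is a hypothetical sketch of what those types plausibly look like, inferred from the usage example below; the package's own `.d.ts` declarations are authoritative and may differ:

```typescript
// Hypothetical shapes inferred from the client's usage example;
// consult the package's type declarations for the real definitions.
type Role = 'system' | 'user' | 'assistant';

interface Message {
  role: Role;
  content: string;
}

// The package documents support for gpt-3.5-turbo only.
type OpenAiModel = 'gpt-3.5-turbo';

const example: Message = { role: 'user', content: 'Hello!' };
console.log(example.role);
```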
- Node.js v18.0.0 or newer (no additional dependencies required)
- An OpenAI API key
```bash
yarn add @nexys/openai-client
```
First, import the client and necessary types:
```typescript
import { getChatCompletion, Message, OpenAiModel } from '@nexys/openai-client';
```
Next, prepare your messages and call `getChatCompletion`:
```typescript
const messages: Message[] = [
  { role: 'system', content: 'You are a helpful assistant.' },
  { role: 'user', content: 'Who won the world series in 2020?' }
];

const model: OpenAiModel = 'gpt-3.5-turbo';

try {
  const response = await getChatCompletion(messages, model, 'YOUR_OPENAI_API_KEY');
  console.log(response.choices[0].message.content);
} catch (err) {
  console.error(err);
}
```
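In a real application you will usually read the API key from the environment rather than hard-coding it, and retry transient failures before surfacing an error. The following sketch shows that pattern; the `getChatCompletion` here is a hypothetical stand-in so the snippet runs on its own, and the `chatWithRetry` helper is not part of the package:

```typescript
import { setTimeout as sleep } from 'node:timers/promises';

// Hypothetical stand-in so this sketch is self-contained; in your application,
// import getChatCompletion from '@nexys/openai-client' instead.
interface Message { role: 'system' | 'user' | 'assistant'; content: string }
interface ChatResponse { choices: { message: Message }[] }

async function getChatCompletion(
  messages: Message[],
  model: string,
  apiKey: string,
): Promise<ChatResponse> {
  const last = messages[messages.length - 1];
  return { choices: [{ message: { role: 'assistant', content: `stub reply to: ${last.content}` } }] };
}

// Read the key from the environment rather than committing it to source control.
const apiKey = process.env.OPENAI_API_KEY ?? '';

// Retry transient failures with simple exponential backoff before giving up.
async function chatWithRetry(prompt: string, attempts = 3): Promise<string> {
  let lastError: unknown;
  for (let attempt = 0; attempt < attempts; attempt++) {
    try {
      const response = await getChatCompletion(
        [{ role: 'user', content: prompt }],
        'gpt-3.5-turbo',
        apiKey,
      );
      return response.choices[0].message.content;
    } catch (err) {
      lastError = err;
      await sleep(2 ** attempt * 250); // back off: 250ms, 500ms, 1s, ...
    }
  }
  throw lastError;
}
```

Swapping the stub for the real import leaves the retry logic unchanged, since only the inner call differs.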
Contributions are warmly welcomed! See CONTRIBUTING.md for how to contribute, and CODE_OF_CONDUCT.md for our code of conduct.
If you encounter any issues, please file an issue on our issue tracker.
This project is licensed under the MIT license. See the LICENSE file for the full license text.
This is an unofficial client library and is not officially endorsed by OpenAI. Please consult the OpenAI terms of use for more details.