AI Providers
Set up AI API keys
This guide provides more details about creating and setting AI API keys in your .env file for using Xyne’s chat feature.
To use AI chat, you need to configure access credentials. You can use one of the following providers: AWS, OpenAI, Ollama, Together AI, or Fireworks AI.
AWS (Amazon Bedrock)
- Sign in to the AWS Management Console.
- Generate your Access Key and Secret Key.
- Add them as environment variables in your .env.default file:
.env.default
The AWS_REGION key is optional in the .env.default file; if not provided, it defaults to us-west-2.
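A minimal sketch of the AWS entries is shown below. The variable names AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY are assumptions based on standard AWS SDK conventions; verify them against the keys listed in Xyne’s .env.default template.

```shell
# Assumed variable names (standard AWS SDK conventions) — confirm against
# the keys in Xyne's .env.default template.
AWS_ACCESS_KEY_ID=your-access-key
AWS_SECRET_ACCESS_KEY=your-secret-key
# Optional; defaults to us-west-2 if omitted.
AWS_REGION=us-west-2
```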
OpenAI
- Create an API key from the OpenAI API settings.
- Add it as an environment variable in your .env.default file:
.env.default
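A sketch of the OpenAI entry; the variable name OPENAI_API_KEY follows the common convention but is an assumption here, so confirm it against Xyne’s template.

```shell
# Assumed variable name (common convention) — confirm against Xyne's template.
OPENAI_API_KEY=your-openai-api-key
```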
Ollama (Local LLM)
- Install Ollama and pull the desired model from Ollama.
- Add the model name in your .env.default file:
.env.default
If you’re using Ollama, ensure the specified model is downloaded locally.
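A sketch of the Ollama entry; the variable name OLLAMA_MODEL and the example model are assumptions, so confirm both against Xyne’s template and your local Ollama installation.

```shell
# Assumed variable name — confirm against Xyne's .env.default template.
# The model must already be downloaded locally, e.g. via `ollama pull llama3`.
OLLAMA_MODEL=llama3
```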
Together AI
- Create your API key in Together AI.
- Select a model of your choice.
- Add the following to your .env.default file:
.env.default
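A sketch of the Together AI entries; both variable names are assumptions based on common naming conventions, so verify them against Xyne’s template.

```shell
# Assumed variable names — confirm against Xyne's .env.default template.
TOGETHER_API_KEY=your-together-api-key
TOGETHER_MODEL=your-chosen-model
```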
Fireworks AI
- Create your API key in Fireworks AI.
- Select a model of your choice.
- Add the following to your .env.default file:
.env.default
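A sketch of the Fireworks AI entries; both variable names are assumptions based on common naming conventions, so verify them against Xyne’s template.

```shell
# Assumed variable names — confirm against Xyne's .env.default template.
FIREWORKS_API_KEY=your-fireworks-api-key
FIREWORKS_MODEL=your-chosen-model
```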
Setting up for Reasoning [Together | Fireworks | Ollama]
To enable reasoning for your model, add the REASONING key to your .env.default file:
.env.default
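A sketch of the reasoning flag; the REASONING key is named in this guide, but the value shown is an assumption, so confirm the expected value against Xyne’s template.

```shell
# The REASONING key comes from this guide; the value is an assumed example.
REASONING=true
```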
For example:
Reasoning with Together AI:
.env.default
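A sketch of a Together AI reasoning configuration; the variable names and placeholder model are assumptions, so substitute a reasoning-capable model available on Together AI and verify the names against Xyne’s template.

```shell
# Assumed variable names and placeholder model — confirm against Xyne's template.
TOGETHER_API_KEY=your-together-api-key
TOGETHER_MODEL=your-reasoning-model
REASONING=true
```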
Reasoning with Fireworks AI:
.env.default
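A sketch of a Fireworks AI reasoning configuration; as above, the variable names and placeholder model are assumptions, so substitute a reasoning-capable model from Fireworks AI and verify the names against Xyne’s template.

```shell
# Assumed variable names and placeholder model — confirm against Xyne's template.
FIREWORKS_API_KEY=your-fireworks-api-key
FIREWORKS_MODEL=your-reasoning-model
REASONING=true
```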
Reasoning with Ollama:
.env.default
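A sketch of an Ollama reasoning configuration; the variable name and placeholder model are assumptions, so pull a reasoning-capable model locally and verify the name against Xyne’s template.

```shell
# Assumed variable name and placeholder model — confirm against Xyne's template.
# The model must already be pulled locally with `ollama pull`.
OLLAMA_MODEL=your-reasoning-model
REASONING=true
```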
Key Selection Order
If multiple keys are set in the environment variables, the system follows this precedence:
Only one provider is required to get started. Any of AWS, OpenAI, Ollama, Together AI, or Fireworks AI can be used on its own; you do not need all of them.