This guide provides details on creating AI API keys and setting them in your .env file to use Xyne's chat feature. To use AI chat, you need to configure access credentials for one of the following providers: AWS, OpenAI, Ollama, Gemini, Together AI, or Fireworks AI.
AWS (Amazon Bedrock)
- Sign in to the AWS Management Console
- Generate your Access Key and Secret Key.
- Add them as environment variables in your .env.default file.

The AWS_REGION key is optional in the .env.default file; if not provided, it defaults to us-west-2.
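A minimal sketch of the AWS block in .env.default — the access/secret key variable names here are illustrative (only AWS_REGION and its default are stated above), so verify them against the .env.default file shipped with Xyne:

```env
# AWS Bedrock credentials (key names illustrative; check Xyne's .env.default)
AWS_ACCESS_KEY=your-aws-access-key
AWS_SECRET_KEY=your-aws-secret-key
# Optional; defaults to us-west-2 when omitted
AWS_REGION=us-west-2
```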
OpenAI
- Create an API key from the OpenAI API settings.
- Add it as an environment variable in your .env.default file:
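For example — the variable name below is an assumption based on common convention, so confirm it against Xyne's .env.default:

```env
# OpenAI API key (variable name illustrative)
OPENAI_API_KEY=sk-your-openai-key
```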
Ollama (Local LLM)
- Install Ollama and pull the desired model.
- Add the model name in your .env.default file:
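For example — both the variable name and model name are illustrative; use whichever model you pulled locally:

```env
# Name of the locally installed Ollama model (variable name illustrative)
OLLAMA_MODEL=llama3.1
```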
Google AI
- Generate your API key from Google AI Studio.
- Add it as an environment variable in your .env.default file:
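For example — the variable name is an assumption; confirm it against Xyne's .env.default:

```env
# Gemini API key (variable name illustrative)
GEMINI_API_KEY=your-gemini-api-key
```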
Together AI
- Create your API key in Together AI
- Select the model of your choice.
- Add the following to your .env.default file:
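For example — variable names and the model identifier are illustrative, so check Xyne's .env.default and the Together AI model catalog for exact values:

```env
# Together AI credentials and model (names illustrative)
TOGETHER_API_KEY=your-together-api-key
TOGETHER_MODEL=meta-llama/Llama-3.3-70B-Instruct-Turbo
```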
Fireworks AI
- Create your API key in Fireworks AI
- Select the model of your choice.
- Add the following to your .env.default file:
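For example — variable names and the model identifier are illustrative, so check Xyne's .env.default and the Fireworks AI model catalog for exact values:

```env
# Fireworks AI credentials and model (names illustrative)
FIREWORKS_API_KEY=your-fireworks-api-key
FIREWORKS_MODEL=accounts/fireworks/models/llama-v3p1-70b-instruct
```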
Setting up Reasoning [Together | Fireworks | Ollama]:
To enable reasoning for your model, add the REASONING flag in your .env.default file.
For example:
Reasoning with Together AI:
Reasoning with Fireworks AI:
Reasoning with Ollama:
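A sketch of the reasoning configuration — only the REASONING flag is named in the text above; its value format, the model variable names, and the reasoning-capable model identifiers are all assumptions to verify against Xyne's .env.default:

```env
# Enable reasoning (flag named above; value format assumed)
REASONING=true

# Reasoning with Together AI (names illustrative)
TOGETHER_MODEL=deepseek-ai/DeepSeek-R1

# Reasoning with Fireworks AI (names illustrative)
# FIREWORKS_MODEL=accounts/fireworks/models/deepseek-r1

# Reasoning with Ollama (names illustrative)
# OLLAMA_MODEL=deepseek-r1
```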
Additional BASE URL support
We’ve also added support for setting base URL values for OpenAI and Together AI. Along with your OpenAI or Together API key, you can set the base URL value:
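For example — the variable names below are assumptions (the URLs shown are the providers' standard API endpoints; a custom value is typically used for proxies or compatible gateways):

```env
# Custom base URLs (variable names illustrative; check Xyne's .env.default)
OPENAI_BASE_URL=https://api.openai.com/v1
TOGETHER_BASE_URL=https://api.together.xyz/v1
```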