If you want to deploy Xyne on AWS instead of your local machine, this document will give you a detailed guide to do so. We recommend using an EC2 instance for the deployment, as it is the simplest to set up.
Follow the steps listed below to get started:
Inside the server folder of the xyne folder, you will find a .env.default file; this is the .env file that our Docker setup uses.
For the moment it contains some default generated environment variables that we've set up for the app to work.
We strongly recommend generating your own ENCRYPTION_KEY, SERVICE_ACCOUNT_ENCRYPTION_KEY, and JWT_SECRET values for security.
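For example, you can generate random values with OpenSSL. This is only a sketch: 32 hex-encoded bytes per secret is an assumption here, so check Xyne's documentation for any required key length or format.

```shell
# Generate three independent random secrets (32 bytes each, hex-encoded).
# NOTE: the exact key format Xyne expects is an assumption.
ENCRYPTION_KEY=$(openssl rand -hex 32)
SERVICE_ACCOUNT_ENCRYPTION_KEY=$(openssl rand -hex 32)
JWT_SECRET=$(openssl rand -hex 32)

# Print them in .env format so they can be pasted into .env.default.
echo "ENCRYPTION_KEY=$ENCRYPTION_KEY"
echo "SERVICE_ACCOUNT_ENCRYPTION_KEY=$SERVICE_ACCOUNT_ENCRYPTION_KEY"
echo "JWT_SECRET=$JWT_SECRET"
```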
Due to our agentic RAG implementation, requests can exceed the maximum TPM (tokens-per-minute) limit of OpenAI's gpt-4o model.
For the best experience, we recommend using AWS Bedrock or Claude, as this improves performance and accuracy.
In the .env.default file, you can modify the following and replace the missing values with your own:
.env.default file
ENCRYPTION_KEY=<YOUR_ENCRYPTION_KEY>
SERVICE_ACCOUNT_ENCRYPTION_KEY=<YOUR_SERVICE_ACCOUNT_ENCRYPTION_KEY>
GOOGLE_CLIENT_ID=<YOUR_GOOGLE_CLIENT_ID>
GOOGLE_CLIENT_SECRET=<YOUR_GOOGLE_CLIENT_SECRET>
GOOGLE_REDIRECT_URI=http://localhost:3001/v1/auth/callback
GOOGLE_PROD_REDIRECT_URI=<YOUR_Public_IPv4_DNS_ADDRESS>/v1/auth/callback
JWT_SECRET=<YOUR_JWT_SECRET>
DATABASE_HOST=xyne-db
VESPA_HOST=vespa

## If using AWS Bedrock
AWS_ACCESS_KEY=<YOUR_AWS_ACCESS_KEY>
AWS_SECRET_KEY=<YOUR_AWS_ACCESS_SECRET>
AWS_REGION=<YOUR_AWS_REGION>

## OR [ If using Open AI ]
OPENAI_API_KEY=<YOUR_OPEN_API_KEY>

## OR [ If using Ollama ]
OLLAMA_MODEL=<YOUR_OLLAMA_MODEL_NAME>

## OR [ If using Together AI ]
TOGETHER_API_KEY=<YOUR_TOGETHER_API_KEY>
TOGETHER_MODEL=<YOUR_TOGETHER_MODEL>
TOGETHER_FAST_MODEL=<YOUR_TOGETHER_FAST_MODEL>

## OR [ If using Fireworks AI ]
FIREWORKS_API_KEY=<YOUR_FIREWORKS_API_KEY>
FIREWORKS_MODEL=<YOUR_FIREWORKS_MODEL>
FIREWORKS_FAST_MODEL=<YOUR_FIREWORKS_FAST_MODEL>

## OR [ If using Google AI ]
GEMINI_API_KEY=<YOUR_GEMINI_API_KEY>
GEMINI_MODEL=<YOUR_GEMINI_MODEL_NAME>

## If you are using custom OpenAI or Together AI endpoints
BASE_URL=<YOUR_BASE_URL>

HOST=<YOUR_Public_IPv4_DNS_ADDRESS>
Ensure that this IPv4 address is the same as the one you've added in your Google Cloud Project.
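As a quick sanity check, you can assemble the production redirect URI from your instance's Public IPv4 DNS and compare it against the value registered in your Google Cloud Project. The DNS name below is a placeholder; substitute your own.

```shell
# Placeholder: replace with your instance's actual Public IPv4 DNS.
PUBLIC_DNS="ec2-203-0-113-10.compute-1.amazonaws.com"

# Build the redirect URI in the same shape as GOOGLE_PROD_REDIRECT_URI
# in .env.default, then compare it to the URI registered in Google Cloud.
GOOGLE_PROD_REDIRECT_URI="${PUBLIC_DNS}/v1/auth/callback"
echo "$GOOGLE_PROD_REDIRECT_URI"
```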
To use the chat feature of Xyne, you need any one AI provider (AWS Bedrock, Ollama, OpenAI, Together AI, Fireworks AI, or Google AI). Missing keys will disable chat functionality.
You can check out the AI Providers section for better clarity.
Post Execution Setup [Frontend Environment Variables]
After you have deployed your application, you need to set up some frontend environment variables for it.
To do this, create a .env.production file in your application, and then add the following: