Create different instructions for Docker and bare metal

powermaker450 2024-07-25 17:21:38 -04:00
parent a84b68ec32
commit e912042490
3 changed files with 124 additions and 58 deletions

BARE.md (new file, 60 lines added)

@ -0,0 +1,60 @@
# Running Tailchat Assistant normally
1. Clone the project and install dependencies:
```bash
git clone https://git.povario.com/powermaker450/Tailchat-Assistant && cd Tailchat-Assistant
npm i
```
2. Create the `.env` file from the example file:
```bash
mv .env.example .env
```
3. **Make sure that the "Openapi Platform Plugin" and "App Integration" plugins are enabled.** Then [follow the instructions here to create an App.](https://tailchat.msgbyte.com/docs/advanced-usage/openapp/create)
4. Open the `.env` file with your preferred text editor and populate it:
```bash
HOST=http://localhost:11000
# Wherever your Tailchat server is running
ID=
# Your bot ID
SECRET=
# Your bot secret
API_ENDPOINT=http://localhost:8080/v1
# The OpenAI-compatible endpoint for the bot to send messages to. Defaults to "http://localhost:8080/v1".
# e.g. "http://localhost:8080/v1", "https://api.openai.com/v1"
API_KEY=none
# Your API key for OpenAI/LocalAI. Defaults to the string "none".
# MAKE SURE to fill this in with your OpenAI API key or a key you may have set for LocalAI.
# If you didn't set an API key for LocalAI, you may leave this as the default.
TEXT_MODEL=gpt-4
# The model to query when sending text messages. Defaults to "gpt-4".
# e.g. "gpt-3.5-turbo", "gpt-4"
CREATE_IMAGE_MODEL=stablediffusion-cpp
# The model to use when creating images. Defaults to "stablediffusion-cpp".
# e.g. "dall-e-3", "stablediffusion-cpp"
ANALYZE_IMAGE_MODEL=gpt-4-vision-preview
# The model to use when analyzing images. Defaults to "gpt-4-vision-preview".
# e.g. "gpt-4-vision-preview", "llava"
ALLOWED_CHAT=
# The ID of the channel that the bot is allowed to respond in.
# The bot will always respond to Direct Messages.
SAFE_WORD=\
# When this character/string is detected anywhere in a message, the bot won't respond to it.
# Defaults to "\".
```
5. After completing these steps and making any desired changes, build and run the project:
```bash
npm run build
npm run start
```
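Before building, it can help to confirm that the values with no usable default (`HOST`, `ID`, `SECRET`) were actually filled in. A minimal sketch of such a pre-flight check, as a hypothetical helper that is not part of the repository:

```bash
#!/usr/bin/env bash
# Hypothetical pre-flight check (not part of the repository): report any
# required variable that is empty or unset before starting the bot.

# check_env VAR...: print each missing variable, return 1 if any are missing.
check_env() {
  local missing=0 var
  for var in "$@"; do
    if [ -z "${!var}" ]; then
      echo "Missing required variable: $var" >&2
      missing=1
    fi
  done
  return "$missing"
}

# Load .env into the current shell, then check the values with no default:
#   set -a; . ./.env; set +a
#   check_env HOST ID SECRET || exit 1
```

`${!var}` is bash's indirect expansion, so run this with `bash` rather than a plain POSIX `sh`.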

DOCKER.md (new file, 61 lines added)

@ -0,0 +1,61 @@
# Running Tailchat Assistant in Docker
1. Clone the project:
```bash
git clone https://git.povario.com/powermaker450/Tailchat-Assistant && cd Tailchat-Assistant
```
2. Create the `.env` file from the example file:
```bash
mv .env.example .env
```
3. **Make sure that the "Openapi Platform Plugin" and "App Integration" plugins are enabled.** Then [follow the instructions here to create an App.](https://tailchat.msgbyte.com/docs/advanced-usage/openapp/create)
4. Open the `.env` file with your preferred text editor and populate it:
**NOTE: Because of how Docker organizes containers and networks, your `HOST` and `API_ENDPOINT` values will likely not be `localhost`. Use the container names or IPs instead.**
```bash
HOST=http://tailchat-core:11000
# Wherever your Tailchat server is running
ID=
# Your bot ID
SECRET=
# Your bot secret
API_ENDPOINT=http://localai:8080/v1
# The OpenAI-compatible endpoint for the bot to send messages to. Defaults to "http://localhost:8080/v1".
# e.g. "http://localhost:8080/v1", "https://api.openai.com/v1"
API_KEY=none
# Your API key for OpenAI/LocalAI. Defaults to the string "none".
# MAKE SURE to fill this in with your OpenAI API key or a key you may have set for LocalAI.
# If you didn't set an API key for LocalAI, you may leave this as the default.
TEXT_MODEL=gpt-4
# The model to query when sending text messages. Defaults to "gpt-4".
# e.g. "gpt-3.5-turbo", "gpt-4"
CREATE_IMAGE_MODEL=stablediffusion-cpp
# The model to use when creating images. Defaults to "stablediffusion-cpp".
# e.g. "dall-e-3", "stablediffusion-cpp"
ANALYZE_IMAGE_MODEL=gpt-4-vision-preview
# The model to use when analyzing images. Defaults to "gpt-4-vision-preview".
# e.g. "gpt-4-vision-preview", "llava"
ALLOWED_CHAT=
# The ID of the channel that the bot is allowed to respond in.
# The bot will always respond to Direct Messages.
SAFE_WORD=\
# When this character/string is detected anywhere in a message, the bot won't respond to it.
# Defaults to "\".
```
5. After populating the `.env` file, start the container:
```bash
docker compose up -d
```
Docker will build the container image and start the project. Use `docker compose logs -f` to confirm everything is working and no errors have occurred.
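If you're unsure what to put in `HOST` or `API_ENDPOINT`, standard Docker commands can list the container names and addresses on your machine. A sketch (the container name `tailchat-core` below is an example; yours may differ):

```bash
# List running containers and the networks they are attached to.
docker ps --format '{{.Names}}\t{{.Networks}}'

# Show the IP address of a specific container (name is an example).
docker inspect -f '{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' tailchat-core
```

Containers on the same user-defined network can usually reach each other by container name, so the name alone is often enough for `HOST` and `API_ENDPOINT`.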

@ -28,61 +28,6 @@ Reach out to me at [contact@povario.com](mailto:contact@povario.com) if you enco
--- 
# Getting started
**Choose how you want to run the bot:**
- [Normally (with `npm`)](https://git.povario.com/powermaker450/Tailchat-Assistant/src/branch/main/BARE.md)
- [Docker](https://git.povario.com/powermaker450/Tailchat-Assistant/src/branch/main/DOCKER.md)