substrate-cli runs on three microservices: api-server, consumer-service, and llm-node. Alongside these, Redis and RabbitMQ are used for communication, bringing the total to five services, all packaged together in docker-compose.
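A minimal docker-compose layout for those five services might look like the sketch below. This is an illustration only: the build paths, ports, and image tags are assumptions, not the project's actual compose file.

```yaml
services:
  api-server:
    build: ./api-server          # hypothetical build context
    ports:
      - "8080:8080"              # assumed HTTP/WebSocket port
    depends_on: [rabbitmq, redis]
  consumer-service:
    build: ./consumer-service
    depends_on: [rabbitmq, redis]
  llm-node:
    build: ./llm-node
    depends_on: [rabbitmq]
  rabbitmq:
    image: rabbitmq:3-management
    ports: ["5672:5672", "15672:15672"]
  redis:
    image: redis:7
    ports: ["6379:6379"]
```

The `depends_on` entries encode the communication paths described below: every service talks through RabbitMQ, and llm-node never talks to api-server directly.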
Below are the links to the three microservices listed above:
- api-server (Go), the entry point to substrate-cli: https://github.com/substrate-cli/api-server
- consumer-service (Go), which picks, depending on the prompt, which logic to follow: https://github.com/substrate-cli/consumer-service-cli
- llm-node (Node.js), which calls LLM APIs to get a response based on the user prompt: https://github.com/substrate-cli/llm-node-cli
Pre-setup. All of the services can be cloned and spun up locally, but RabbitMQ and Redis must be installed locally to get everything running.
- RabbitMQ
  - On Windows: install using Chocolatey or the Windows installer, https://www.rabbitmq.com/docs/install-windows
  - On Mac: install using Homebrew (make sure Homebrew is installed first), https://www.rabbitmq.com/docs/install-homebrew
- Redis
  - On Windows: install using a curl command in PowerShell, https://redis.io/docs/latest/operate/oss_and_stack/install/archive/install-redis/install-redis-on-windows/
  - On Mac: install using Homebrew (make sure Homebrew is installed first), https://redis.io/docs/latest/operate/oss_and_stack/install/archive/install-redis/install-redis-on-mac-os/
- Additional
  - Installing Homebrew on Mac: https://brew.sh

Once installed, verify by checking the versions of Redis and RabbitMQ, e.g. `redis-server --version` and `rabbitmqctl version`.
Start Redis and RabbitMQ locally.
- Update the environment variables in the .env file of each service with the correct API keys; add your API key for your preferred model. All of the required variables are listed in each repo's README.
- Once both Redis and RabbitMQ are running, open your preferred terminal and cd into each microservice directory, so that you have three terminals open. Start by running llm-node:
- llm-node: npm run dev
- consumer-service: go run ./cmd/app
- api-server: go run ./cmd/app
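The .env files referenced above might look like the fragment below. Every variable name here is a hypothetical placeholder; the authoritative list lives in each repo's README.

```
# Hypothetical variable names -- check each repo's README for the real ones.
RABBITMQ_URL=amqp://guest:guest@localhost:5672/
REDIS_URL=redis://localhost:6379
OPENAI_API_KEY=sk-...   # key for your preferred model provider
MODE=cli                # api-server only: "cli" or "server"
```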
The CLI under the hood.
api-server – exposes the endpoints and is the entry point to the CLI. It supports two modes, "cli" and "server" (set via an environment variable). The api-server is also responsible for receiving live updates about the tasks currently being processed, and it supports WebSocket connections for streaming those live updates.
consumer-service – this service does not expose any endpoints; it uses goroutines to handle the processing tasks and communicates only over a RabbitMQ channel with api-server and llm-node. It handles each user request according to the type of the input prompt: URL clone, GitHub clone, prompt-to-UI, or prompt-to-full-stack generation. After generation, consumer-service starts the generated cluster as a Docker container and makes it accessible to the host OS. The user can then download the project via download.bat (Windows) or download.command (Mac).
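The route-by-prompt-type, goroutine-per-task model described above can be sketched as follows. The classification rules here are illustrative guesses (the real routing logic lives in consumer-service), and the function names are invented for this example:

```go
package main

import (
	"fmt"
	"strings"
	"sync"
)

// classify maps a raw prompt to one of the four task types described in
// the text. These detection rules are hypothetical stand-ins, not the
// service's actual logic.
func classify(prompt string) string {
	p := strings.ToLower(prompt)
	switch {
	case strings.Contains(p, "github.com"):
		return "github-clone"
	case strings.HasPrefix(p, "http://"), strings.HasPrefix(p, "https://"):
		return "url-clone"
	case strings.Contains(p, "full stack"):
		return "fullstack-gen"
	default:
		return "prompt-to-ui"
	}
}

// handle processes each prompt in its own goroutine, mirroring the
// goroutine-per-task model, and waits for all of them to finish.
func handle(prompts []string) []string {
	results := make([]string, len(prompts))
	var wg sync.WaitGroup
	for i, p := range prompts {
		wg.Add(1)
		go func(i int, p string) {
			defer wg.Done()
			results[i] = classify(p)
		}(i, p)
	}
	wg.Wait()
	return results
}

func main() {
	fmt.Println(handle([]string{"https://example.com", "build a full stack todo app"}))
}
```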
llm-node – does not expose any endpoints and communicates only over a RabbitMQ channel. It is responsible for long-running tasks and sends the response back to consumer-service on completion. llm-node is reachable only from consumer-service.
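The consumer-service/llm-node exchange is a request/reply pattern over a message queue. A minimal in-process sketch of that pattern, with Go channels standing in for RabbitMQ (the real services use RabbitMQ queues, and none of these names come from the repos):

```go
package main

import (
	"fmt"
	"time"
)

// task couples a prompt with a reply channel; in the real system the
// reply channel's role is played by a RabbitMQ reply queue.
type task struct {
	prompt string
	reply  chan string
}

// llmWorker stands in for llm-node: it pulls long-running tasks off the
// queue and sends the result back to the requester on completion.
func llmWorker(queue <-chan task) {
	for t := range queue {
		time.Sleep(10 * time.Millisecond) // simulate a long-running LLM call
		t.reply <- "response for: " + t.prompt
	}
}

func main() {
	queue := make(chan task)
	go llmWorker(queue)

	// consumer-service side: publish a task, then block on the reply.
	reply := make(chan string)
	queue <- task{prompt: "generate a login page", reply: reply}
	fmt.Println(<-reply)
}
```

Because only consumer-service holds the queue, this also mirrors the constraint that llm-node is reachable from consumer-service alone.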
Substrate is open source and free for public use.