Use the ollama-node library to make HTTP requests to the Ollama server we set up previously.
Deploy Ollama locally to interact with LLMs through its REST API.
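
As a rough sketch of what such a request looks like under the hood (the ollama-node library wraps these same REST endpoints), the following assumes Ollama is listening on its default port 11434 and that a model named "llama2" has already been pulled; adjust the model name to whatever you have installed:

```javascript
// Minimal sketch: one non-streaming completion request against the local
// Ollama server's /api/generate endpoint (default port 11434).
async function generate(prompt) {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama2", // assumed model name; replace with the model you pulled
      prompt,
      stream: false,   // return one complete JSON object instead of a stream
    }),
  });
  if (!res.ok) throw new Error(`Ollama request failed: ${res.status}`);
  const data = await res.json();
  return data.response; // the generated text
}

generate("Why is the sky blue?").then(console.log).catch(console.error);
```

With stream set to true, the server instead returns newline-delimited JSON chunks as tokens are generated, which is what you would read incrementally for a streaming UI.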