Running a Local Model with Ollama and Building an End-to-End Application
I found out that a single ChatGPT API call for my project costs around 30 cents, and it charges the same every time you call the model. That isn't much if you're only experimenting, but compared with hosted models like ChatGPT (free or premium version), local models are a good alternative. They are not as big (in terms of parameters), but for day-to-day applications you can easily use them with a few customisations and adjustments. Here I am sharing how I made my own custom local model, along with the prompts and responses related to it.
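As a rough sketch of what "custom local model" means in Ollama, a Modelfile lets you set a base model, sampling parameters, and a system prompt. The base model name and the values below are illustrative assumptions, not my exact setup:

```
# Modelfile — a minimal sketch; "llama3" is an assumed base model pulled via `ollama pull llama3`
FROM llama3
# Lower temperature for more deterministic day-to-day answers (illustrative value)
PARAMETER temperature 0.7
# Custom behaviour baked into the model (illustrative prompt)
SYSTEM You are a concise assistant for my day-to-day tasks.
```

You would then build and run it with `ollama create mymodel -f Modelfile` followed by `ollama run mymodel`.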
Vaibhav Gandotra
12/23/2024 · 1 min read

