pratiksatpai2013/Loca_chat_app_sample
Local_chat_app_sample

🧠 Ollama LLM Chat API with FastAPI (Google Colab Edition)

This project demonstrates how to pull an LLM with Ollama and serve it as a chat API using FastAPI, all inside a Google Colab environment.
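A minimal setup sketch for the Colab side (commands follow Ollama's standard Linux install flow; the exact notebook cells and model name are assumptions, not taken from this repo):

```shell
# Install Ollama via its official install script (inside the Colab VM)
curl -fsSL https://ollama.com/install.sh | sh

# Start the Ollama server in the background (listens on localhost:11434 by default)
ollama serve > ollama.log 2>&1 &

# Pull a model to chat with, e.g. mistral
ollama pull mistral
```

In a Colab notebook these would typically be run with a leading `!` (or `%%bash`) since cells execute Python by default.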

📦 Tech Stack

  • 🦙 Ollama (for running local LLMs)
  • ⚡ FastAPI (for exposing the chat interface as an API)
  • 🚀 Uvicorn (ASGI server to run FastAPI)
  • 🧪 Google Colab (as the environment to run everything)
  • 🧠 LangChain, LangGraph (for LLM orchestration)

💡 Features

✅ Pull and run an LLM (such as llama2, mistral, or phi) using Ollama
✅ Expose the model via a simple FastAPI endpoint
✅ Send chat messages and get model-generated responses
✅ Designed to run entirely on Google Colab for quick testing
