Start your LLM journey with Llama 2 7B. Launching soon in September!
* Client-side chat persistence (see the sketch after this list)
* Streaming output, so you don't have to wait for the full response to finish
* No login needed
* Made possible by ggerganov's llama.cpp project
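
For context, here is a minimal sketch of how streaming output and client-side chat persistence can fit together in the browser. It assumes a llama.cpp-style `/completion` endpoint that streams `data:` lines with a `content` field, and it uses a hypothetical `chat-history` localStorage key; the endpoint path, response shape, and helper names are assumptions for illustration, not taken from this project.

```typescript
// Sketch: stream tokens from a llama.cpp-style server and persist the
// chat history in the browser's localStorage (no login, no server state).

interface ChatMessage {
  role: "user" | "assistant";
  content: string;
}

const STORAGE_KEY = "chat-history"; // hypothetical storage key

// Load any previously saved conversation from localStorage.
function loadHistory(): ChatMessage[] {
  const raw = localStorage.getItem(STORAGE_KEY);
  return raw ? (JSON.parse(raw) as ChatMessage[]) : [];
}

// Save the conversation back to localStorage after every update.
function saveHistory(history: ChatMessage[]): void {
  localStorage.setItem(STORAGE_KEY, JSON.stringify(history));
}

// Send a prompt and stream the reply chunk by chunk, invoking onToken
// for each piece so the UI can render partial output immediately.
async function streamCompletion(
  prompt: string,
  onToken: (token: string) => void,
): Promise<string> {
  // Assumed endpoint and request shape for a llama.cpp-style server.
  const response = await fetch("/completion", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ prompt, stream: true }),
  });

  const reader = response.body!.getReader();
  const decoder = new TextDecoder();
  let full = "";

  while (true) {
    const { value, done } = await reader.read();
    if (done) break;
    // Assumes server-sent-event style lines: `data: {"content": "..."}`.
    // For brevity this treats each chunk as containing whole lines.
    for (const line of decoder.decode(value).split("\n")) {
      if (!line.startsWith("data: ")) continue;
      const token = JSON.parse(line.slice(6)).content as string;
      full += token;
      onToken(token);
    }
  }
  return full;
}

// Example turn: render tokens as they arrive, then persist the history.
async function sendMessage(userText: string, outputEl: HTMLElement): Promise<void> {
  const history = loadHistory();
  history.push({ role: "user", content: userText });

  const reply = await streamCompletion(userText, (token) => {
    outputEl.textContent += token; // show partial output immediately
  });

  history.push({ role: "assistant", content: reply });
  saveHistory(history);
}
```

Because the history lives entirely in localStorage, conversations survive page reloads without any account or server-side storage, which is what makes the "no login needed" model workable.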