[NeurIPS'23 Oral] Visual Instruction Tuning (LLaVA) built towards GPT-4V level capabilities and beyond.
Chinese LLaMA & Alpaca large language models, with local CPU/GPU training and deployment (Chinese LLaMA & Alpaca LLMs)
[NeurIPS 2023] LLM-Pruner: On the Structural Pruning of Large Language Models. Support Llama-3/3.1, Llama-2, LLaMA, BLOOM, Vicuna, Baichuan, TinyLlama, etc.
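LLM-Pruner scores coupled structures with gradient-based importance before removing them; the toy sketch below (an assumption for illustration, not LLM-Pruner's actual criterion) only shows the mechanics of structural pruning: dropping whole hidden neurons shrinks both the layer that produces them and the layer that consumes them.

```python
import numpy as np

def prune_neurons(W1, b1, W2, keep_ratio=0.5):
    """Toy magnitude-based structural pruning of one MLP hidden layer.

    Keeps the hidden neurons whose weight rows have the largest L2
    norm. Removing a neuron deletes a row of W1/b1 (the layer that
    produces it) and the matching column of W2 (the layer that
    consumes it), so the pruned network stays shape-consistent.
    """
    importance = np.linalg.norm(W1, axis=1)      # one score per hidden neuron
    k = max(1, int(len(importance) * keep_ratio))
    keep = np.sort(np.argsort(importance)[-k:])  # indices of neurons kept
    return W1[keep], b1[keep], W2[:, keep]

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(8, 4)), rng.normal(size=8)
W2 = rng.normal(size=(3, 8))
W1p, b1p, W2p = prune_neurons(W1, b1, W2, keep_ratio=0.5)
assert W1p.shape == (4, 4) and b1p.shape == (4,) and W2p.shape == (3, 4)
```

Because entire neurons are removed rather than individual weights zeroed, the pruned matrices are genuinely smaller and need no sparse kernels at inference time.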
Running Llama 2 and other Open-Source LLMs on CPU Inference Locally for Document Q&A
kani (カニ) is a highly hackable microframework for chat-based language models with tool use/function calling. (NLP-OSS @ EMNLP 2023)
Improves Llama-2's proficiency in the comprehension, generation, and translation of Chinese.
📚 Local PDF-Integrated Chat Bot: Secure Conversations and Document Assistance with LLM-Powered Privacy
Chat with Llama 2, with responses grounded in reference documents retrieved from a vector database. Runs a locally available model using GPTQ 4-bit quantization.
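GPTQ itself minimizes layer output error using second-order information; the sketch below is only a toy illustration of the storage idea 4-bit quantization builds on: weights become small integers plus one floating-point scale per group (round-to-nearest here, an assumed simplification, not GPTQ's solver).

```python
import numpy as np

def quantize_4bit(w, group_size=8):
    """Toy round-to-nearest 4-bit quantization with per-group scales."""
    w = w.reshape(-1, group_size)
    # int4 range is -8..7; pick the scale so the largest magnitude maps to 7
    scale = np.abs(w).max(axis=1, keepdims=True) / 7.0
    scale = np.maximum(scale, 1e-12)  # guard against all-zero groups
    q = np.clip(np.round(w / scale), -8, 7).astype(np.int8)
    return q, scale

def dequantize_4bit(q, scale):
    return (q * scale).reshape(-1)

w = np.array([0.5, -0.25, 0.1, 0.7, -0.7, 0.05, 0.0, 0.3])
q, s = quantize_4bit(w)
w_hat = dequantize_4bit(q, s)
# round-to-nearest bounds the error by half a quantization step
assert np.max(np.abs(w - w_hat)) <= s.max() / 2 + 1e-9
```

Storing a 4-bit integer per weight plus one scale per group is what shrinks a model to roughly a quarter of its fp16 size, making local CPU/GPU inference practical.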
An Offline Document Enquiry LLM for Everyone
LLM Security Project with Llama Guard
This project streamlines the fine-tuning process, enabling you to leverage Llama-2's capabilities for your own projects.
MLX Institute | Fine-tuning Llama-2 7B on The Onion to generate new satirical articles given a headline
AI-powered email management system that uses NLP models and machine learning to automate email categorization, sentiment analysis, summarization, and response generation.
The lexical simplification baseline for MLSP Shared Task at BEA 2024