Introduction

Riding the wave of artificial intelligence, Langzhou Technology continues to drive the sustainable development of the AI ecosystem in its characteristic spirit of innovation. Following the open-source release of the Mengzi 3-13B large model, the company has now launched a lightweight large model, Mengzi 3-8B, releasing it as open source with free commercial use. This move injects new vitality into the AI field and is good news for individual developers and AI enthusiasts.

Mengzi 3-8B: Lightweight, High Performance

Mengzi 3-8B delivers performance comparable to Mengzi 3-13B across multiple application scenarios, including writing, code, summarization, and reading comprehension. Its smaller parameter count makes it particularly suitable for individual developers and AI enthusiasts.

Core Advantages

  • Low VRAM, efficient inference: Mengzi 3-8B has modest VRAM requirements, occupying less than 16 GB at half precision, so it runs on PCs and most consumer-grade graphics cards (see the sketch after this list).
  • Outstanding knowledge processing: It significantly surpasses open-source models of the same size in knowledge processing and problem solving across Chinese, English, world knowledge (MMLU), programming, and mathematics.
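
As a rough check of the half-precision VRAM figure, the sketch below loads the model in float16 and prints its in-memory footprint. This is a minimal example, assuming the Hugging Face model id Langboat/Mengzi3-8B-Base used in the quick-start code further down; exact numbers depend on the actual parameter count and runtime overhead.

```python
# Minimal sketch (assumes the model id from the quick-start below; requires
# transformers + accelerate and a GPU with roughly 16 GB of free VRAM).
import torch
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "Langboat/Mengzi3-8B-Base",
    torch_dtype=torch.float16,   # half precision: 2 bytes per parameter
    device_map="auto",
    trust_remote_code=True,
)

# Back-of-the-envelope: ~8e9 parameters * 2 bytes ≈ 15 GiB of weights,
# consistent with the "< 16 GB at half precision" figure above.
print(f"model footprint: {model.get_memory_footprint() / 2**30:.1f} GiB")
```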

User Guide

The Mengzi 3-8B open-source repository provides detailed documentation and download options, so developers and enthusiasts can get started quickly.
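
For example, the model weights can be fetched ahead of time with huggingface_hub. This is a hedged sketch that assumes the repository id Langboat/Mengzi3-8B-Base from the quick-start code below.

```python
# Sketch: pre-download the model files with huggingface_hub (assumed repo id).
from huggingface_hub import snapshot_download

local_dir = snapshot_download("Langboat/Mengzi3-8B-Base")
print(f"model files downloaded to: {local_dir}")
```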

Open Source Address

Quick Start

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the tokenizer and model from the Hugging Face Hub.
tokenizer = AutoTokenizer.from_pretrained("Langboat/Mengzi3-8B-Base", use_fast=False, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained("Langboat/Mengzi3-8B-Base", device_map="auto", trust_remote_code=True)

# Build the prompt and move it to the GPU if one is available.
inputs = tokenizer('Input: Introduce Mencius. Output:', return_tensors='pt')
if torch.cuda.is_available():
    inputs = inputs.to('cuda')

# Generate up to 512 new tokens and decode the result.
pred = model.generate(**inputs, max_new_tokens=512, repetition_penalty=1.01, eos_token_id=tokenizer.eos_token_id)
print(tokenizer.decode(pred[0], skip_special_tokens=True))
```
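
If you prefer to see tokens as they are produced rather than waiting for the full completion, the same generate call can be combined with transformers' TextStreamer. This optional sketch reuses the tokenizer, model, and inputs from the quick-start above.

```python
# Optional: stream the output token by token (reuses the objects defined above).
from transformers import TextStreamer

streamer = TextStreamer(tokenizer, skip_prompt=True, skip_special_tokens=True)
model.generate(**inputs, max_new_tokens=512, repetition_penalty=1.01,
               eos_token_id=tokenizer.eos_token_id, streamer=streamer)
```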

Mengzi 3-8B: A New Driving Force for AI Innovation

Langzhou Technology's Mengzi 3-8B model, with its lightweight, high-performance design, provides a new driving force for AI innovation. Individual developers and enterprise users alike can benefit from it, advancing AI adoption in business scenarios and the development of the digital economy.

Conclusion

Langzhou Technology's open source initiative not only provides strong support for the development of AI technology but also makes an important contribution to the improvement of the open source ecosystem. We look forward to working with more AI developers and enthusiasts to jointly meet the challenges and opportunities brought by AI technology.

