Tencent/ncnn
Tencent's open-source mobile AI inference framework, runs deep learning models offline on phones with high speed and no dependencies.
ncnn is a high-performance neural network inference framework optimized for the mobile platform.
AI Summary
What This Project Does
Simply put, it's an engine that lets phones run AI models on-device. Think of it as an accelerator built specifically for deep learning on your phone.
What Problems It Solves
It addresses two problems: AI features that are slow or laggy on phones, and the need for an internet connection. Previously, many AI features relied on servers; with ncnn they run locally on the device, which protects privacy and saves data.
Who It's For
Ideal for Android/iOS programmers building smart apps, or tech leads looking to deploy AI technology into mobile products.
Typical Use Cases
1. Face recognition login without uploading photos to the cloud.
2. Real-time photo beauty filters that don't lag.
3. Voice recognition assistants that understand commands offline.
4. Smart album sorting, automatically identifying faces and objects in photos.
Key Strengths & Highlights
Battle-tested by Tencent in WeChat and QQ. According to its published benchmarks, it is faster than other known open-source frameworks on mobile devices, requires no third-party dependencies, and keeps app size small.
Getting Started Requirements
Requires some C++ programming knowledge or familiarity with model conversion tools. If you are a pure Python user, you will need to convert your model to ncnn's format first; it is not suitable for complete coding beginners.
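To give a sense of the C++ knowledge involved, here is a minimal inference sketch using ncnn's `Net` and `Extractor` classes. The model file names (`squeezenet.param`/`squeezenet.bin`) and the blob names (`"data"`, `"output"`) are placeholders; the real names depend on the model you converted.

```cpp
// Minimal ncnn inference sketch. File names and blob names below are
// assumptions for illustration -- substitute your own converted model.
#include <vector>
#include "net.h" // ncnn

int main()
{
    ncnn::Net net;
    net.opt.use_vulkan_compute = false; // CPU-only for simplicity

    // ncnn models ship as a .param (structure) + .bin (weights) pair,
    // produced by the conversion tools from ONNX/Caffe/etc.
    if (net.load_param("squeezenet.param") != 0) return -1;
    if (net.load_model("squeezenet.bin") != 0) return -1;

    // Stand-in for a 224x224 BGR image buffer from the camera or a file.
    const int w = 224, h = 224;
    std::vector<unsigned char> pixels(w * h * 3, 128);
    ncnn::Mat in = ncnn::Mat::from_pixels(pixels.data(), ncnn::Mat::PIXEL_BGR, w, h);

    // Subtract the per-channel mean (values are model-specific).
    const float mean_vals[3] = {104.f, 117.f, 123.f};
    in.substract_mean_normalize(mean_vals, 0);

    ncnn::Extractor ex = net.create_extractor();
    ex.input("data", in);   // feed the input blob

    ncnn::Mat out;
    ex.extract("output", out); // run inference up to the output blob
    return 0;
}
```

A Python user would first export the model to ONNX, then run ncnn's conversion tooling to obtain the `.param`/`.bin` pair loaded above.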
Purpose
Suitable for developers who want to add AI features to apps, since deployment is simple and fast. It is not for those who only want to train models: ncnn handles inference only.
Category
Tech Stack
Project Info
- Primary Language
- C++
- Default Branch
- master
- License
- NOASSERTION
- Created
- Jun 30, 2017
- Last Commit
- yesterday
- Last Push
- yesterday
- Indexed
- Apr 19, 2026