English README | 中文说明
Wanny is an all-in-one butler system that connects WeChat, smart devices, and service providers. It supports natural-language control, long-term memory, proactive care, and a unified console across home automation, messaging, and mobility scenarios.
For the overall product introduction, system composition, and architecture overview, start with 2026-03-29-wanny-design.md.
This repository is licensed under Apache License 2.0. Third-party submodules under third_party/ retain their own upstream licenses.
Important Notice
- This project is intended for personal learning, experimentation, and non-commercial use only.
- This project does not currently provide a hosted SaaS service; deploy it yourself if you want to use it.
- It is under active and rapid iteration, with the goal of exploring broader possibilities for AI in device control, automation, memory, and agent-driven execution.
- Because the project is evolving quickly, there may still be many bugs and rough edges during this stage.
- You are welcome to star the repository, open issues, contribute code, and help connect more physical devices and real-world integrations.
- Please use it with care: AI-driven operations may trigger real commands, affect connected devices or external services, and introduce practical safety risks.
- You are responsible for evaluating whether a given setup, authorization, or action is safe for your environment.
- By using this project, you accept the associated risks; the repository maintainer does not assume liability for loss, damage, service disruption, or unsafe outcomes caused by usage, misconfiguration, automation side effects, or third-party platform changes.
- Messaging entry: WeChat
- Smart home platforms: Mijia, Midea
- Smart appliance platforms: Hisense
- Mobility platforms: Mercedes-Benz
- Automation hubs: Home Assistant
Wanny is not just about controlling a few devices. The bigger idea is an AI butler that understands your rhythm, connects real-world context, and quietly gets things ready before you ask.
Picture the trip home after work:
- Your car leaves the office, and Wanny uses vehicle location, route context, and expected arrival time to understand that you are heading home.
- Instead of stopping at a reminder, it can start acting in the physical world by turning on the rice cooker so dinner is already underway before you arrive.
- It can also check live weather and indoor or outdoor temperature, then preheat the floor heating if the home is getting too cold, so the space feels right the moment you open the door.
- If fuel is running low, it can combine your current route, nearby station options, and real-world signals such as rising gas prices to proactively recommend a better place to refuel.
The long-term goal is to connect device control, environmental sensing, mobility signals, real-world information, long-term memory, and proactive decision-making into one continuous experience. We want AI to do more than answer questions. We want it to understand context, participate in daily life, and make the right things happen at the right time.
That is also why this project keeps expanding toward more device integrations and more real-world capabilities: every new connection makes Wanny feel less like a rule engine and more like a genuinely helpful companion.
- Sense, decide, execute: Monitor device states in real time, reason over scenes and modes, and drive actions through a unified control layer.
- Behavior learning: Learn recurring habits, convert repeated confirmations into trusted automations, and keep the user in control.
- Permission matrix: Configure whether each action should ask, always allow, or never allow under different scenarios.
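The permission matrix can be sketched as a small lookup table. This is a minimal illustration, not the project's actual data model; the action and scene names are hypothetical:

```python
from enum import Enum


class Policy(Enum):
    ASK = "ask"      # require explicit user confirmation over WeChat
    ALLOW = "allow"  # trusted: execute without asking
    DENY = "deny"    # never execute in this scenario


# Hypothetical matrix: (action, scene) -> policy; names are illustrative only.
PERMISSION_MATRIX = {
    ("lights.on", "evening"): Policy.ALLOW,
    ("heater.on", "away"): Policy.ASK,
    ("door.unlock", "night"): Policy.DENY,
}


def resolve(action: str, scene: str) -> Policy:
    """Fall back to ASK so unknown combinations stay gated."""
    return PERMISSION_MATRIX.get((action, scene), Policy.ASK)
```

Defaulting unknown combinations to "ask" keeps the user in control even when a new action has no configured policy yet.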
- Natural-language interaction: Talk to Wanny through WeChat for everyday questions, device operations, and task execution.
- Simple vs. complex task routing:
- Lightweight requests can be answered directly by the backend prompt flow.
- Multi-step tasks can be expanded into shell-oriented execution flows through the AI agent.
- Manual gate: Any system-level operation must still be explicitly approved by the user over WeChat before execution.
- Safety roadmap: The project is designed to evolve toward stronger isolation with sandboxed and containerized execution.
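The routing and manual-gate behavior above can be sketched roughly as follows. This is an assumption-laden simplification: the keyword list and `Route` shape are invented for illustration, and the real agent presumably uses LLM-based intent classification rather than substring matching:

```python
from dataclasses import dataclass

# Keywords that suggest a multi-step, shell-oriented task; purely illustrative.
COMPLEX_HINTS = ("deploy", "backup", "install", "migrate")


@dataclass
class Route:
    track: str            # "prompt" (direct answer) or "agent" (shell-oriented flow)
    needs_approval: bool  # manual gate: system-level work must be confirmed over WeChat


def route_request(text: str) -> Route:
    if any(hint in text.lower() for hint in COMPLEX_HINTS):
        # Multi-step task: expand through the agent, but keep the manual gate on.
        return Route(track="agent", needs_approval=True)
    # Lightweight request: answer directly from the backend prompt flow.
    return Route(track="prompt", needs_approval=False)
```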
- Dual-track memory:
- Memory A (semantic vectors) stores dialogue fragments for retrieval and context recall.
- Memory B (structured profile) extracts durable preferences such as temperature habits or entertainment preferences.
- Daily review: Wanny can summarize and refine user understanding over time.
- Proactive suggestions: Environmental events, routines, and contextual signals can trigger timely reminders without becoming spammy.
- Backend: Django 6.0 + Django Channels (WebSocket)
- Database: MySQL + vector storage (ChromaDB/FAISS)
- AI Engine: Gemini CLI (`gemini`)
- Messaging: WeChat iLink protocol (`wechatbot-sdk`)
- Device Integrations: Mijia API (`mijiaAPI`), Midea cloud, Hisense, Mercedes (`mbapi2020`)
- Frontend: Vue 3 + Vite + TypeScript + Tailwind CSS + shadcn-vue + vue-i18n
/wanny
/backend # Django workspace
/apps
/brain # LLM agent logic and decision hub
/comms # WeChat gateway and routing
/database # Core ORM models and storage
/memory # Long-term memory and profile extraction
/devices # Device aggregation and control
/providers # Third-party provider integrations
/frontend # Vue 3 frontend workspace (Landing Page + Jarvis Console)
/docs # Design specs, implementation notes, and plans
/third_party # Third-party source references, migration aids, acknowledgements
- Docker deployment guide: docs/DEPLOYMENT.md.
- The system overview lives in 2026-03-29-wanny-design.md.
- The proactive care design and implementation status lives in 2026-04-03-proactive-care-design.md.
- The roadmap lives in docs/ROADMAP.md.
- Detailed specifications and implementation notes live under `docs/superpowers/specs`.
- Chinese documentation is available in README.zh-CN.md.
- Connect at least one device platform and, if needed, a Home Assistant account.
- Open the console and go to **Care**.
- Add a weather source in **Care Rules**.
- Review system rules or create a custom rule with the condition builder.
- Run **Scan now** or refresh weather to generate suggestions.
- Approve, reject, ignore, or execute suggestions from the care center.
For the current backend/frontend scope, scheduler behavior, and remaining gaps, see 2026-04-03-proactive-care-design.md.
- `frontend/` contains the public landing page and the console UI.
- The current frontend stack is Vue 3 + Vite + TypeScript + Tailwind CSS + shadcn-vue + vue-i18n.
- For frontend-specific run instructions, see `frontend/README.md`.
- Privacy first: Sensitive operations remain gated and local-first where possible.
- Execution filters: Dangerous commands such as `sudo` or destructive file removal should be blocked by policy.
- AI rule chain: Root-level `README.md` and `README.zh-CN.md` explain the project, while `GEMINI.md`, `backend/GEMINI.md`, and `frontend/GEMINI.md` define operational AI rules.
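An execution filter of that kind can be sketched as a small deny-list check. The patterns below are illustrative only; a real policy would be broader, configurable, and paired with the sandboxing mentioned in the safety roadmap:

```python
import re

# Illustrative deny-list; a production policy would need far more coverage.
BLOCKED_PATTERNS = [
    re.compile(r"(^|\s)sudo\s"),         # privilege escalation
    re.compile(r"\brm\s+-[rf]{1,2}\b"),  # destructive file removal
    re.compile(r"\bmkfs\b|\bdd\s+if="),  # disk-level destruction
]


def is_blocked(command: str) -> bool:
    """Return True when a shell command matches any deny-list pattern."""
    return any(p.search(command) for p in BLOCKED_PATTERNS)
```

Deny-lists like this are a first line of defense, not a guarantee; they are easy to bypass with quoting or indirection, which is why the roadmap points toward sandboxed execution.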
- `third_party/midea_auto_cloud` is included as a Git submodule and used as a reference implementation for Midea integration work. Upstream license: Apache License 2.0.
- `third_party/mbapi2020` is included as a Git submodule and used as a reference implementation for Mercedes integration work. Upstream license: MIT License.
- `third_party/HisenseHA` is included as a Git submodule and used as a reference implementation for Hisense integration work. Upstream license: MIT License.
- These repositories are used for protocol study, field mapping, and migration reference rather than as runtime dependencies of Wanny.
- Thanks to sususweet/midea_auto_cloud for the upstream Home Assistant integration and implementation ideas.
- Thanks to ReneNulschDE/mbapi2020 for the upstream Home Assistant integration and implementation ideas.
- Thanks to manymuch/HisenseHA for the upstream Home Assistant integration and implementation ideas.
- This project may contain third-party dependencies, third-party reference implementations, or compatibility code derived from public information and community projects. Intellectual property, licensing, and compliance boundaries remain subject to the original upstream terms.
- Some integrations may rely on unofficial interfaces, undocumented APIs, reverse engineering, packet inspection, or compatibility adaptations. That does not imply official support from the related platforms and does not guarantee long-term stability.
- If you connect third-party accounts, devices, apps, or cloud services, evaluate the risks yourself, including API changes, rate limiting, account controls, service policy restrictions, and regional differences.
- The project should be treated as an experimental system for research, learning, personal integration, and compatibility exploration unless you have separately validated legal, operational, and security requirements for production use.
- Because Wanny can control real devices and interact with external services, use extra caution around powered devices, heating equipment, locks, unattended automations, and safety-critical scenarios.
- Users remain responsible for deciding whether each authorization, automation, or control instruction is safe for the current environment.
- AI agents should first read root-level `README.md` or `README.zh-CN.md` to understand the project context, repository layout, and workspace boundaries.
- Shared AI rules live in root-level `GEMINI.md`.
- Backend-specific AI rules live in `backend/GEMINI.md`.
- Frontend-specific AI rules live in `frontend/GEMINI.md`.
- Recommended reading order: `README.md` / `README.zh-CN.md` → `GEMINI.md` → `backend/GEMINI.md` or `frontend/GEMINI.md`.
GEMINI.md is for AI behavior rules rather than product introduction. Project background, structure, and run instructions should remain in the relevant README files.
- The backend exposes `uv run python manage.py audit_midea_mapping --limit 80` to scan for high-risk Midea mapping issues.
- This helps catch raw labels, untranslated option text, and low-level state leakage before they regress into the device UI.

