Padauk — Burmese-First Agentic AI Assistant
Padauk is built to act like a practical daily companion, not just a text generator. Based on Gemma 4 and specialized with a custom xLAM-format dataset, it understands complex Burmese context and intent, chooses tools when needed, and stays useful on low-resource devices through GGUF, Ollama, and OpenAI-compatible local APIs.
What verifies Padauk as a Burmese-first assistant
Internal project pages come first; external artifacts appear only when they are the actual proof source for the assistant release.
| Claim | Source | Why it matters |
|---|---|---|
| Internal lineage | Burmese GPT | Shows the Burmese language foundation that Padauk builds on. |
| Related code model | Burmese-Coder-4B | Connects the assistant layer to the broader Burmese model ecosystem. |
| Primary release | WYNN747/padauk-burmese-agentic-llm | Primary Ollama-ready artifact for local assistant deployment. |
| Original fine-tuned weights | WYNN747/Burmese-GPT-Padauk | Preserves the source weights behind the Padauk specialization. |
Why Padauk is different
Padauk is a Burmese-first agentic assistant. The center of gravity is the user’s language, the user’s intent, and the user’s daily task list.
Padauk is based on Gemma 4, but the value is in the specialization layer: a custom xLAM-format dataset shapes Padauk toward complex Burmese intent understanding, agentic tool use, and coherent multi-step assistance.
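To make the specialization layer concrete, a single xLAM-style function-calling record pairs a user query with the tools the model may call and the expected tool call. The field layout below follows the public xLAM dataset convention (`query`, `tools`, `answers`); the Burmese query and the `get_weather` tool are hypothetical examples, not part of the actual Padauk dataset.

```python
import json

# A hypothetical xLAM-format record: "query" is the user request (here in
# Burmese), "tools" lists the callable functions, and "answers" holds the
# expected tool call the model should produce.
record = {
    "query": "မနက်ဖြန် ရန်ကုန်မြို့ ရာသီဥတု ဘယ်လိုလဲ",  # "What is tomorrow's weather in Yangon?"
    "tools": [
        {
            "name": "get_weather",  # hypothetical tool
            "description": "Get the weather forecast for a city.",
            "parameters": {
                "city": {"type": "string", "description": "City name"},
                "day": {"type": "string", "description": "Day, e.g. 'tomorrow'"},
            },
        }
    ],
    "answers": [
        {"name": "get_weather", "arguments": {"city": "Yangon", "day": "tomorrow"}}
    ],
}

# One record per line (JSONL) is a common storage layout for such datasets.
line = json.dumps(record, ensure_ascii=False)
```

Records like this teach the model to map free-form Burmese intent onto structured tool calls rather than plain completions.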
Padauk builds on the Burmese language foundation established by Burmese GPT and can optionally hand structured context to another LLM agent when a different capability is the better fit.
Burmese-first
Padauk is designed around Myanmar language use, so the assistant feels native rather than translated.
Agentic by design
It is meant to do more than complete text, with workflows that can decide when search, tools, or follow-up steps help.
Handoff-ready
It keeps the conversation coherent across steps and can pass work to another LLM agent when that improves the outcome.
Low-resource deployment that travels with the user
Padauk is designed to run where users actually have access: laptops, mini PCs, edge boxes, and other constrained hardware. The goal is not cloud dependency but practical availability, privacy, and lower operating cost.
GGUF makes quantized models portable. Ollama makes serving and local experimentation simple. OpenAI-compatible local APIs make Padauk easier to wire into existing apps, agents, and automation scripts without forcing a bespoke integration path.
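As a sketch of that integration path, the snippet below builds an OpenAI-compatible chat request against a local server using only the standard library. It assumes Ollama's default OpenAI-compatible endpoint at `http://localhost:11434/v1`; the model tag `padauk` is an assumption, so substitute whatever tag your local install serves.

```python
import json
import urllib.request

def build_chat_request(base_url: str, model: str, user_text: str) -> urllib.request.Request:
    """Build an OpenAI-compatible chat completion request for a local server."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": user_text}],
    }
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

if __name__ == "__main__":
    # Endpoint and model tag are assumptions about a local Ollama setup.
    req = build_chat_request("http://localhost:11434/v1", "padauk", "မင်္ဂလာပါ")
    with urllib.request.urlopen(req) as resp:
        reply = json.loads(resp.read())
    print(reply["choices"][0]["message"]["content"])
```

Because the request shape matches the OpenAI chat API, existing clients and agent frameworks can point their base URL at the local server and work unchanged.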
Hugging Face artifacts
The deployable Padauk stack is documented through two concrete artifacts: the primary Ollama-ready release for local serving and the original fine-tuned weights that carry the Gemma 4 base and the custom xLAM-format adaptation.
Primary Ollama release
WYNN747/padauk-burmese-agentic-llm is the main Hugging Face artifact for local Ollama-style deployment and day-to-day runtime use.
Original fine-tuned weights
WYNN747/Burmese-GPT-Padauk contains the original fine-tuned weights behind Padauk’s Gemma 4 and custom xLAM specialization.
GGUF portability
Quantized inference keeps model files practical for small machines and faster local iteration.
Ollama serving
A familiar local runtime helps Padauk behave like a usable assistant rather than a one-off demo.
OpenAI-compatible APIs
Existing app and agent integrations can connect without reinventing the interface layer.
Low-resource fit
The assistant is shaped for real-world constraints, including modest memory, local privacy, and unstable connectivity.
Practical daily automation and companion-style assistance
Padauk is meant to be useful on ordinary days. It can help draft messages, summarize notes, translate between Burmese and English, explain instructions, and prepare checklists or briefings.
Companion-style help
Use Padauk as a conversational partner for planning, learning, and everyday decisions without switching tone or context.
Tool-aware workflows
Let the assistant search, summarize, use tools, or trigger a handoff when a plain answer is not enough.
Daily automation
Turn routine tasks into short interactions: rewrite a note, produce a summary, draft a reply, or prepare a next step list.
When the request needs more structure, Padauk can act like an orchestrator: it can search, reason, and then hand work off to another LLM agent or specialist workflow while keeping the conversation in Burmese.
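One minimal sketch of that routing decision, with all tool and function names hypothetical: an assistant turn is either plain Burmese text (returned as-is), a structured call to a local tool (run directly), or a call the orchestrator cannot satisfy (handed off to another agent).

```python
import json
from typing import Callable

# Hypothetical local tools the orchestrator can run directly.
TOOLS: dict[str, Callable[[dict], str]] = {
    "summarize_note": lambda args: f"Summary of {args['note_id']}",
}

def route_turn(model_turn: str, handoff: Callable[[str, dict], str]) -> str:
    """Route one assistant turn: plain text passes through, a JSON tool call
    runs locally, and an unknown tool is handed off to another agent."""
    try:
        call = json.loads(model_turn)
    except json.JSONDecodeError:
        return model_turn  # plain Burmese answer, no tool needed
    name, args = call.get("name"), call.get("arguments", {})
    if name in TOOLS:
        return TOOLS[name](args)  # run the local tool
    return handoff(name, args)    # specialist step handled by another agent
```

The user-facing conversation stays with the orchestrator in Burmese; only the specialist step crosses the handoff boundary.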
Ecosystem / Package Links
Live Model Arena
Type in Burmese or English; the assistant responds in Burmese.
Frequently Asked Questions
What is Padauk?
Padauk (ပိတောက်) is a Burmese-first agentic AI assistant built for everyday use. It is designed for practical daily automation, companion-style assistance, and tool-aware workflows rather than only text generation.
Is Padauk just another text-generation model?
No. Padauk is positioned as an assistant product, not only a text-generation or fine-tuning model. It is meant to answer naturally in Burmese, choose tools when useful, and support real day-to-day work.
What makes Padauk agentic?
Padauk is agentic because it is designed to do more than chat. It can support planning, search, rewriting, summarizing, and tool use so the system can help complete tasks instead of only completing text.
What base model is Padauk built on?
Padauk is based on Gemma 4 and then specialized with a custom xLAM-format dataset so it can better understand complex Burmese intent and perform agentic tool use in practical workflows.
Can Padauk run on low-resource devices?
Yes. Padauk is designed with low-resource deployment in mind so it can work well on laptops, mini PCs, and other constrained environments. The focus is portability, privacy, and practical access instead of requiring a large cloud-only stack.
Does Padauk support GGUF, Ollama, and OpenAI-compatible APIs?
Yes. The project is framed around common local deployment paths such as GGUF for portable quantized inference, Ollama for local serving, and OpenAI-compatible APIs for easier app and agent integration.
Which Hugging Face release should I use for Padauk?
Use WYNN747/padauk-burmese-agentic-llm as the primary Ollama-ready release for local deployment. Use WYNN747/Burmese-GPT-Padauk when you want the original fine-tuned weights behind the Padauk adaptation.
How is Padauk related to Burmese GPT?
Padauk builds on the Burmese language foundation established by Burmese GPT research. The project page for that foundation is available at /projects/burmese-gpt, while Padauk turns that research into a practical assistant for daily use.
Can Padauk hand off work to another LLM agent?
Yes. Padauk can pass structured context, user intent, and task state to another LLM agent or specialist workflow when that is the better fit. That keeps Padauk focused on the user-facing conversation while another agent handles the specialized step.
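As an illustration of what "structured context, user intent, and task state" could look like on the wire, here is a hedged sketch of a handoff payload. The field names are assumptions for illustration, not a documented Padauk schema.

```python
import json
from dataclasses import dataclass, field, asdict

@dataclass
class HandoffPayload:
    """Hypothetical structured context passed from Padauk to another agent."""
    user_intent: str                                  # what the user is trying to do
    conversation_summary: str                         # compressed context so far
    task_state: dict = field(default_factory=dict)    # progress on the current task
    reply_language: str = "my"                        # keep the user-facing reply in Burmese

def to_handoff_json(payload: HandoffPayload) -> str:
    """Serialize the payload for the receiving agent."""
    return json.dumps(asdict(payload), ensure_ascii=False)
```

Keeping the payload explicit means the receiving agent gets everything it needs in one message, while Padauk retains ownership of the Burmese-language conversation.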
What can I use Padauk for every day?
Padauk can help with drafting messages, translating Burmese and English, summarizing notes, explaining concepts, preparing checklists, and handling routine daily automation in a companion-style flow.
What language does Padauk use?
Padauk is Burmese-first. You can ask in Burmese or English, and the experience is designed to respond naturally in the Myanmar language while staying useful for mixed-language workflows.
Is the live arena free to use?
Yes. The Padauk Model Arena on this page is free to use and does not require an account. It is available for live testing directly in the browser.