Just successfully connected the latest GLM-5 (via the GLM Coding Plan/PaaS API) to OpenClaw using the CLI. The context window is huge, and it works perfectly with the custom endpoint.
If you want to set this up, here is the exact command I used.
The Configuration:
Note that the baseUrl targets the specific coding PaaS endpoint, and the API type is set to openai-completions.
openclaw config set models.providers.glm_coding '{
"baseUrl": "https://open.bigmodel.cn/api/coding/paas/v4",
"apiKey": "YOUR_API_KEY_HERE",
"api": "openai-completions",
"models": [
{
"id": "glm-5",
"name": "GLM-5 Coding",
"contextWindow": 200000
}
]
}'
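One gotcha: the whole JSON blob travels to the shell as a single quoted argument, so a stray quote or comma breaks it silently. A quick pre-flight check with jq (the only extra tool assumed here; this is just a sketch, not part of OpenClaw itself) catches that before the config call:

```shell
# Hold the provider config in a variable so it can be validated first.
PROVIDER_JSON='{
  "baseUrl": "https://open.bigmodel.cn/api/coding/paas/v4",
  "apiKey": "YOUR_API_KEY_HERE",
  "api": "openai-completions",
  "models": [
    { "id": "glm-5", "name": "GLM-5 Coding", "contextWindow": 200000 }
  ]
}'

# jq -e exits non-zero on malformed JSON or a missing field,
# so a broken blob never reaches the config command.
echo "$PROVIDER_JSON" | jq -e '.models[0].id'
```

If jq prints "glm-5", the blob is well-formed and safe to pass to `openclaw config set models.providers.glm_coding "$PROVIDER_JSON"`.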
After running the config:
- Set it as your active model:
openclaw config set agents.defaults.model.primary "glm_coding/glm-5"
- Restart the gateway:
openclaw gateway restart
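You can also sanity-check the key and endpoint directly with curl before blaming the OpenClaw config. This is a hedged sketch: it assumes the PaaS endpoint speaks the standard OpenAI chat-completions dialect at /chat/completions (which the "openai-completions" api type implies), so the exact response shape may differ:

```shell
# Smoke test against the coding PaaS endpoint.
# Assumption: OpenAI-style chat-completions route and auth header.
API_KEY="YOUR_API_KEY_HERE"
BODY='{"model":"glm-5","messages":[{"role":"user","content":"ping"}]}'

curl -sS "https://open.bigmodel.cn/api/coding/paas/v4/chat/completions" \
  -H "Authorization: Bearer $API_KEY" \
  -H "Content-Type: application/json" \
  -d "$BODY"
```

A JSON reply with a choices array means the key and baseUrl are good; an auth error here means the problem is the key, not the OpenClaw setup.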
I’ve verified this locally and it’s running smoothly. Happy coding! 💻✨
#OpenClaw #GLM5 #AI #Coding #DevTools


Great tutorial! Thanks for sharing the exact config. I was struggling with the endpoint setup - turns out I was using the wrong baseUrl. The 200k context window on GLM-5 is impressive for coding tasks. 🚀