alice

package module
v0.6.128

Warning: This package is not in the latest version of its module.
Published: May 7, 2026 License: MIT Imports: 3 Imported by: 0

README

Alice

AI Local Interactive Cross-device Engine

Same AI session, anywhere. Terminal ↔ Feishu. No cloud lock-in. Works with OpenCode (DeepSeek V4), Codex, Claude, Gemini, Kimi.


中文 (Chinese)

  • Access your agent from anywhere. Terminal at your desk. Feishu on your phone. Same session, same context — just /session resume.
  • Pick your AI. OpenCode / DeepSeek V4, Codex, Claude, Gemini, Kimi. Mix and match per scene.
  • Zero cloud dependency. The agent CLI runs on your machine. No API keys, no vendor lock-in.
  • Goal mode × DeepSeek = low cost. Fire off dozens of tasks for pennies. Get notified on your phone when done.

A Feishu long-connection connector for CLI-based LLM agents — OpenCode (DeepSeek V4), Codex, Claude, Gemini, Kimi.

Runs as a local multi-bot runtime: receives Feishu messages over WebSocket, routes them into chat or work scenes, calls the configured LLM CLI, and sends replies, files, and images back. Zero cloud dependency — everything runs on your machine.
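The routing step described above (chat vs. work scenes) can be sketched as follows. This is an illustrative stand-in, not Alice's actual API: the function name and the `#work` prefix convention are assumptions based on the Quick Start example further down.

```go
package main

import (
	"fmt"
	"strings"
)

// routeScene is a hypothetical sketch of scene routing: messages that
// start with the "#work" tag go to the work scene (task threads), and
// everything else falls through to the chat scene.
func routeScene(text string) string {
	if strings.HasPrefix(strings.TrimSpace(text), "#work") {
		return "work"
	}
	return "chat"
}

func main() {
	for _, msg := range []string{"#work deploy the staging environment", "how are you?"} {
		fmt.Printf("%q -> %s scene\n", msg, routeScene(msg))
	}
}
```

The real runtime additionally carries session context so that `/session resume` can pick up the same thread from the terminal.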

Documentation

Full documentation is at alice-space.github.io/alice.

Tutorials: Get Alice running in 5 minutes
How-To Guides: Task-focused recipes
Configuration Reference: Every config key documented
Architecture: Code-level architecture

中文文档 (Chinese documentation) »

Quick Start

npm install -g @alice_space/alice
alice setup
# edit ~/.alice/config.yaml
alice --feishu-websocket

Then in Feishu: @Alice #work deploy the staging environment — Alice creates a task thread, runs your LLM backend, and streams progress back. Use /session anytime to resume the task from your terminal.
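Step 3 above edits `~/.alice/config.yaml`. A minimal sketch of what that file might contain is shown below; every key here is hypothetical, so consult the Configuration Reference for the real schema:

```yaml
# ~/.alice/config.yaml (illustrative only; key names are assumptions)
feishu:
  app_id: "cli_xxxxxxxx"       # from your Feishu bot's credentials
  app_secret: "xxxxxxxx"
llm:
  backend: opencode            # opencode | codex | claude | gemini | kimi
```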

Development

make check   # fmt, vet, test, race
make build
make run

Contribution guide: CONTRIBUTING.md

License

MIT

Documentation

Index

Constants

This section is empty.

Variables

var ConfigExampleYAML []byte
var OpenCodePluginJS []byte
var PromptFS fs.FS
var SkillsFS fs.FS
var SoulExampleMarkdown []byte
var SystemdUnitTmpl []byte

Functions

This section is empty.

Types

This section is empty.

Directories

Path Synopsis
cmd
connector command
internal
llm
Package llm provides a unified interface for running LLM agent CLIs (claude, codex, gemini, kimi, etc.) as subprocess backends.
llm/internal/shared
Package shared provides utilities reused across LLM provider packages (MergeEnv, EnvKey, ExtractString) and scanner buffer constants.
llm/providers/claude
Package claude drives the claude CLI as a subprocess and parses its stream-json output into a plain text reply.
llm/providers/codex
Package codex drives the codex CLI as a subprocess and parses its JSON-lines output into a plain text reply with optional file-change events.
llm/providers/kimi
Package kimi drives the kimi CLI as a subprocess and parses its stream-json output into a plain text reply.
llm/providers/opencode
Package opencode drives the opencode CLI as a subprocess and parses its JSON-lines output into a plain text reply, session ID, and token usage.
