
feat: add usage statistics visualization (closes #13)#55

Open
Neo2025new wants to merge 3 commits into ErlichLiu:main from Neo2025new:feature/usage-statistics
Conversation


Neo2025new commented Feb 28, 2026

Summary

Closes #13

Complete implementation of token usage statistics tracking and visualization, split into 3 logical commits:

Commit 1: Token Capture Layer

Capture token usage from all provider SSE streaming responses:

  • Anthropic: parse message_start.usage + message_delta.usage
  • OpenAI/DeepSeek/Custom: enable stream_options, parse final usage chunk
  • Google: parse usageMetadata from stream response
  • Accumulate usage across tool-use rounds in chat-service, persist to ChatMessage.usage
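The Anthropic path above (split `message_start` / `message_delta` events, summed across tool-use rounds) can be sketched as follows. The event shapes mirror Anthropic's SSE format, but `TokenUsage`, `foldRound`, and `addUsage` are illustrative names, not the PR's actual code.

```typescript
// Hedged sketch of per-round usage folding for Anthropic SSE events.
// message_start carries input_tokens (plus a small initial output count);
// message_delta carries the cumulative output_tokens for the message.
interface TokenUsage {
  inputTokens: number;
  outputTokens: number;
}

type AnthropicUsageEvent =
  | { type: "message_start"; message: { usage: { input_tokens: number; output_tokens: number } } }
  | { type: "message_delta"; usage: { output_tokens: number } };

// Fold one streaming round's events into a TokenUsage total.
function foldRound(events: AnthropicUsageEvent[]): TokenUsage {
  const total: TokenUsage = { inputTokens: 0, outputTokens: 0 };
  for (const ev of events) {
    if (ev.type === "message_start") {
      total.inputTokens += ev.message.usage.input_tokens;
      total.outputTokens = ev.message.usage.output_tokens;
    } else {
      // Cumulative count: the latest delta supersedes earlier ones.
      total.outputTokens = ev.usage.output_tokens;
    }
  }
  return total;
}

// Tool-use rounds are summed additively, as the chat-service description says.
function addUsage(a: TokenUsage, b: TokenUsage): TokenUsage {
  return {
    inputTokens: a.inputTokens + b.inputTokens,
    outputTokens: a.outputTokens + b.outputTokens,
  };
}
```

OpenAI-compatible and Google streams report totals once at the end, so for those providers a single event feeds `addUsage` directly.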

Commit 2: Storage Service + IPC

Data persistence and IPC layer for usage tracking:

  • UsageRecord / UsageSummary types + USAGE_IPC_CHANNELS constants
  • usage-stats-service.ts: read/write ~/.proma/usage-stats.json with daily + per-model aggregation
  • Register IPC handlers, expose APIs through preload bridge
  • Call recordUsage() after message persistence in chat-service
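A minimal in-memory sketch of the daily + per-model aggregation that `usage-stats-service.ts` persists to `~/.proma/usage-stats.json`. The `UsageRecord` field names follow the PR's description; the class name and exact shapes are assumptions (the real service writes to disk rather than to a string).

```typescript
// Simplified model of the usage store: one record per (date, model) bucket,
// merged additively as messages are recorded.
interface UsageRecord {
  date: string; // YYYY-MM-DD
  model: string;
  inputTokens: number;
  outputTokens: number;
}

class UsageStatsStore {
  constructor(private records: UsageRecord[] = []) {}

  // Merge one message's usage into its daily/per-model bucket.
  recordUsage(model: string, input: number, output: number, date: string): void {
    const hit = this.records.find((r) => r.date === date && r.model === model);
    if (hit) {
      hit.inputTokens += input;
      hit.outputTokens += output;
    } else {
      this.records.push({ date, model, inputTokens: input, outputTokens: output });
    }
  }

  // Serialize for writing to disk (the real service uses fs under ~/.proma).
  toJSON(): string {
    return JSON.stringify(this.records, null, 2);
  }

  static fromJSON(json: string): UsageStatsStore {
    return new UsageStatsStore(JSON.parse(json) as UsageRecord[]);
  }
}
```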

Commit 3: UI Visualization

Settings panel with recharts dashboard:

  • Summary cards: today / 30-day token totals
  • 30-day trend AreaChart (input/output tokens with gradients)
  • Per-model usage distribution list
  • Empty state placeholder
  • Register "用量" (Usage) tab in SettingsPanel navigation
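The summary cards and per-model list reduce the stored records with straightforward aggregation. This sketch invents a `summarize()` helper and a `UsageSummary` shape for illustration; only the `UsageRecord`/`UsageSummary` names come from the PR.

```typescript
// Illustrative aggregation behind the today / 30-day cards and the
// per-model distribution list.
interface UsageRecord {
  date: string; // YYYY-MM-DD
  model: string;
  inputTokens: number;
  outputTokens: number;
}

interface UsageSummary {
  todayTokens: number;     // today's total (input + output)
  last30DayTokens: number; // rolling 30-day total
  byModel: Record<string, number>;
}

function summarize(records: UsageRecord[], today: string): UsageSummary {
  const todayMs = new Date(today).getTime();
  const cutoffMs = todayMs - 30 * 24 * 60 * 60 * 1000;
  const summary: UsageSummary = { todayTokens: 0, last30DayTokens: 0, byModel: {} };
  for (const r of records) {
    const tokens = r.inputTokens + r.outputTokens;
    const t = new Date(r.date).getTime();
    if (r.date === today) summary.todayTokens += tokens;
    if (t >= cutoffMs && t <= todayMs) {
      summary.last30DayTokens += tokens;
      summary.byModel[r.model] = (summary.byModel[r.model] ?? 0) + tokens;
    }
  }
  return summary;
}
```

The same 30-day window would feed the AreaChart's daily series.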

Files Changed (17 files)

packages/core/ (5 files)

  • providers/types.ts — StreamUsageEvent type
  • providers/sse-reader.ts — TokenUsage interface, accumulation in StreamSSEResult
  • providers/anthropic-adapter.ts — Anthropic usage parsing
  • providers/openai-adapter.ts — OpenAI usage parsing
  • providers/google-adapter.ts — Google usage parsing

packages/shared/ (2 files)

  • types/usage.ts — Usage types and IPC channel constants
  • types/chat.ts — MessageUsage on ChatMessage

apps/electron/ (10 files)

  • main/lib/usage-stats-service.ts — Storage service (new)
  • main/lib/config-paths.ts — getUsageStatsPath()
  • main/lib/chat-service.ts — Usage accumulation + recordUsage() call
  • main/ipc.ts — IPC handler registration
  • preload/index.ts — Preload bridge APIs
  • renderer/atoms/usage-atoms.ts — Jotai atoms (new)
  • renderer/atoms/settings-tab.ts — Add 'usage' tab type
  • renderer/components/settings/UsageSettings.tsx — Dashboard UI (new)
  • renderer/components/settings/SettingsPanel.tsx — Tab registration
  • package.json — recharts dependency

Design Decisions

  • Additive token accumulation works correctly across all 3 providers (Anthropic sends split events, OpenAI/Google send totals once)
  • Optional usage? field on ChatMessage — fully backward compatible with existing JSONL data
  • 4-layer IPC pattern followed: shared types → main handler → preload bridge → renderer atoms
  • SettingsSection/SettingsCard primitives — consistent visual style with existing settings pages
  • recharts AreaChart — lightweight React charting with HSL theme integration
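The backward-compatibility claim for the optional `usage?` field is easy to see: old JSONL history lines simply parse with `usage` left `undefined`. The `MessageUsage`/`ChatMessage` field names below are assumed from the PR description, not copied from the code.

```typescript
// Old JSONL lines have no usage field; new ones may. Both parse into the
// same ChatMessage shape because usage is optional.
interface MessageUsage {
  inputTokens: number;
  outputTokens: number;
}

interface ChatMessage {
  id: string;
  role: "user" | "assistant";
  content: string;
  usage?: MessageUsage;
}

// Parse a JSONL chat history, tolerating blank lines.
function parseHistory(jsonl: string): ChatMessage[] {
  return jsonl
    .split("\n")
    .filter((line) => line.trim().length > 0)
    .map((line) => JSON.parse(line) as ChatMessage);
}
```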

Test plan

  • bun run typecheck passes across all packages ✅
  • Test with Anthropic provider — usage events captured from message_start / message_delta
  • Test with OpenAI-compatible provider — stream_options works, usage returned
  • Test with Google Gemini — usageMetadata parsed correctly
  • Verify chat functionality unaffected (backward compatible)
  • Open Settings → "用量" (Usage) tab — empty state shown initially
  • After conversations, verify stats populate correctly
  • Verify 30-day trend chart renders with correct data
  • Verify per-model distribution shows all used models
  • Test that the "清除" (Clear) button clears all usage data

🤖 Generated with Claude Code

Neo2025new and others added 3 commits March 1, 2026 01:32
Add StreamUsageEvent to the event system and implement token usage
extraction across all three provider adapters:

- Anthropic: parse message_start (input_tokens) + message_delta (output_tokens)
- OpenAI: enable stream_options.include_usage, parse final usage chunk
- Google: parse usageMetadata from streaming response

Token usage is accumulated across tool-use rounds in chat-service and
persisted as an optional `usage` field on ChatMessage. This lays the
groundwork for usage statistics visualization (Issue ErlichLiu#13).

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Add the data persistence and IPC layer for token usage tracking:

- New UsageRecord/UsageSummary types and USAGE_IPC_CHANNELS constants
- usage-stats-service.ts: read/write ~/.proma/usage-stats.json with
  daily and per-model aggregation queries
- Register IPC handlers for getUsageSummary and clearUsageStats
- Expose APIs through preload bridge
- Call recordUsage() in chat-service after message persistence

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Add UsageSettings component with token consumption dashboard:
- Today/30-day summary cards with formatted token counts
- 30-day trend AreaChart (input/output tokens)
- Per-model usage distribution list
- Empty state placeholder
- Register usage tab in SettingsPanel navigation

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
@Neo2025new Neo2025new changed the title feat: capture token usage from provider SSE responses feat: add usage statistics visualization (closes #13) Feb 28, 2026

Development

Successfully merging this pull request may close these issues.

Feature request: Visualize usage statistics and recent conversation token consumption