
feat(kernel-language-model-service): Add language model client#876

Open
grypez wants to merge 2 commits into main from grypez/language-model-client

Conversation


@grypez grypez commented Mar 16, 2026

feat(kernel-language-model-service): open v1 backend and vat clients

Kernel service (src/kernel-service.ts): makeKernelLanguageModelService wraps a chat (and optional sample) function into a flat, hardened object suitable for kernel.registerKernelServiceObject(). Return values are plain data — no exos — so they cross the marshal boundary cleanly. Exported as LANGUAGE_MODEL_SERVICE_NAME = 'languageModelService'.
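The wrapper shape described above can be sketched roughly as follows. This is an illustration, not the package's code: `harden` is stubbed with `Object.freeze` (in the real package it comes from SES), and the type names merely mirror those mentioned in this PR.

```typescript
// Hypothetical mirrors of the PR's ChatParams/ChatResult/SampleParams shapes.
type ChatParams = { model: string; messages: { role: string; content: string }[] };
type ChatResult = { content: string };
type SampleParams = { model: string; prompt: string };

type Backend = {
  chat: (params: ChatParams) => Promise<ChatResult>;
  sample?: (params: SampleParams) => Promise<string>;
};

// Stand-in for SES harden().
const harden = Object.freeze;

// Flat, hardened object whose methods return plain data only, so results
// cross the marshal boundary without any remote-object (exo) wrapping.
const makeKernelLanguageModelServiceSketch = (backend: Backend) =>
  harden({
    chat: async (params: ChatParams) => backend.chat(params),
    ...(backend.sample
      ? { sample: async (params: SampleParams) => backend.sample!(params) }
      : {}),
  });
```

A `sample` method only appears when the backend supplies one, matching the "optional sample function" wording above.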

Open /v1 backend (src/open-v1/): OpenV1BaseService posts to /v1/chat/completions with injected fetch, supporting both single-response and SSE streaming modes. makeOpenV1NodejsService is the Node.js entry point (accepts baseUrl and optional apiKey).
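For the SSE streaming mode, an OpenAI-style stream arrives as `data: <json>` events terminated by `data: [DONE]`. The following is an illustrative sketch of extracting content deltas from such a body, not the package's actual parser:

```typescript
// Collect the assistant text from an OpenAI-style /v1/chat/completions
// SSE body: concatenate choices[0].delta.content across events, stop at [DONE].
const collectSseContent = (body: string): string => {
  let text = '';
  for (const rawLine of body.split('\n')) {
    const line = rawLine.trim();
    if (!line.startsWith('data:')) continue; // skip blanks and comment keep-alives
    const payload = line.slice('data:'.length).trim();
    if (payload === '[DONE]') break;
    const chunk = JSON.parse(payload) as {
      choices?: { delta?: { content?: string } }[];
    };
    text += chunk.choices?.[0]?.delta?.content ?? '';
  }
  return text;
};
```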

Ollama backend (src/ollama/): OllamaNodejsService extends the Open /v1 base, defaulting to http://localhost:11434 and adding a makeInstance({ model }) factory.

Vat clients (src/client.ts): makeChatClient returns an OpenAI-SDK-shaped object (client.chat.completions.create(...)) that routes calls through a CapTP ChatService reference via E(). makeSampleClient is the equivalent for raw token sampling.
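The OpenAI-SDK-shaped surface can be sketched as below. In the real package the service is a CapTP reference invoked via `E()`; here a plain async delegate stands in for illustration, and the model is bound once at client creation (the "enforcing model selection" behavior described in this PR):

```typescript
type Message = { role: string; content: string };

// Hypothetical stand-in for a CapTP ChatService reference.
type ChatServiceRef = {
  chat: (params: { model: string; messages: Message[] }) => Promise<{ content: string }>;
};

// Returns an object shaped like the OpenAI SDK client:
// client.chat.completions.create({ messages }) routes to service.chat,
// always injecting the model chosen at construction time.
const makeChatClientSketch = (service: ChatServiceRef, model: string) => ({
  chat: {
    completions: {
      create: async ({ messages }: { messages: Message[] }) =>
        service.chat({ model, messages }),
    },
  },
});
```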

Test utilities (src/test-utils/): Replaces the previous queue-based mock model with makeMockOpenV1Fetch — a deterministic SSE mock that emits a configured sequence of token strings, keeping integration tests CI-safe and fast. makeMockSample provides the same for the sample path.
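A deterministic SSE mock fetch in this spirit might look like the sketch below. The name, shapes, and behavior here are assumptions for illustration, not the package's `makeMockOpenV1Fetch`; it requires Node 18+ for the global `Response` class.

```typescript
// A fetch substitute that answers any request with a deterministic SSE body
// built from a configured token sequence, ending with `data: [DONE]`.
const makeMockSseFetchSketch =
  (tokens: string[]) =>
  async (_url: string, _init?: unknown): Promise<Response> => {
    const events = tokens.map(
      (content) =>
        `data: ${JSON.stringify({ choices: [{ delta: { content } }] })}\n\n`,
    );
    events.push('data: [DONE]\n\n');
    return new Response(events.join(''), {
      status: 200,
      headers: { 'content-type': 'text/event-stream' },
    });
  };
```

Because the response is fully synthesized, tests that inject this in place of real `fetch` stay network-free and deterministic.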

Also updates kernel-test to use the new chat/sample API: replaces lms-user-vat / lms-queue-vat with lms-chat-vat / lms-sample-vat and the corresponding test files.

test(kernel-test-local): kernel-LMS integration tests

Adds two test variants that share a common runLmsChatKernelTest helper:

  • lms-chat.test.ts (CI): injects makeMockOpenV1Fetch — no network, runs under test:dev.
  • lms-chat.e2e.test.ts (local): uses real fetch against a local Ollama instance, run via test:e2e:local.

Both launch a subcluster with lms-chat-vat through the kernel, register the LMS kernel service, and assert that the vat logs `lms-chat response: Hello` (or equivalent).

Also adds agents.e2e.test.ts (moved from the old layout) exercising json/repl agents end-to-end, and test/suite.test.ts as an Ollama pre-flight check.


Note

Medium Risk
Adds new kernel-exposed service surfaces (chat/sample) plus an OpenAI-compatible HTTP backend with SSE parsing; correctness and type-safety around streaming/non-streaming behavior and marshal-safe hardening are the main risks. Also updates multiple integration tests and local e2e wiring, which may affect CI/local test stability.

Overview
Adds a new language-model service API suitable for kernel registration and vat consumption. makeKernelLanguageModelService exposes a hardened { chat, sample } service under the canonical LANGUAGE_MODEL_SERVICE_NAME, and new vat-side helpers makeChatClient/makeSampleClient provide OpenAI-style ergonomics while enforcing model selection.

Introduces new backends and expands Ollama support. Adds an Open /v1/chat/completions Node.js backend (makeOpenV1NodejsService) with optional API key and SSE streaming support, and extends the Ollama service to support chat and raw sample (generate with raw: true) plus a makeOllamaNodejsKernelService adapter.

Reworks tests and test utilities around the new API. Replaces the queue-based test LMS with makeMockOpenV1Fetch and makeMockSample, adds unit coverage for the new clients/kernel service/Open v1 streaming, and updates kernel-test and kernel-test-local to run new chat/sample vat round-trip tests (including a local Ollama e2e path and updated build/runner config).

Written by Cursor Bugbot for commit ea9f656.

@grypez grypez force-pushed the grypez/language-model-client branch from f2a4ffd to e3cf0b3 Compare March 16, 2026 14:26

github-actions bot commented Mar 16, 2026

Coverage Report

| Status | Category | Percentage | Δ | Covered / Total |
| --- | --- | --- | --- | --- |
| 🔵 | Lines | 77.16% | ⬇️ -0.10% | 7822 / 10137 |
| 🔵 | Statements | 76.96% | ⬇️ -0.11% | 7945 / 10323 |
| 🔵 | Functions | 75.1% | ⬇️ -0.17% | 1880 / 2503 |
| 🔵 | Branches | 73.85% | ⬇️ -0.80% | 3186 / 4314 |
File Coverage — Changed Files

| File | Stmts | Branches | Functions | Lines | Uncovered Lines |
| --- | --- | --- | --- | --- | --- |
| packages/kernel-language-model-service/src/client.ts | 100% | 100% | 100% | 100% | |
| packages/kernel-language-model-service/src/index.ts | 100% (🟰 ±0%) | 100% (🟰 ±0%) | 100% (🟰 ±0%) | 100% (🟰 ±0%) | |
| packages/kernel-language-model-service/src/kernel-service.ts | 100% | 100% | 100% | 100% | |
| packages/kernel-language-model-service/src/types.ts | 100% (🟰 ±0%) | 100% (🟰 ±0%) | 100% (🟰 ±0%) | 100% (🟰 ±0%) | |
| packages/kernel-language-model-service/src/ollama/base.ts | 56.25% (⬇️ -43.75%) | 10.52% (⬇️ -89.48%) | 81.81% (⬇️ -18.19%) | 56.25% (⬇️ -43.75%) | 63-135 |
| packages/kernel-language-model-service/src/ollama/nodejs.ts | 63.63% (⬇️ -36.37%) | 100% (🟰 ±0%) | 40% (⬇️ -60.00%) | 63.63% (⬇️ -36.37%) | 57-60 |
| packages/kernel-language-model-service/src/ollama/types.ts | 100% (🟰 ±0%) | 100% (🟰 ±0%) | 100% (🟰 ±0%) | 100% (🟰 ±0%) | |
| packages/kernel-language-model-service/src/open-v1/base.ts | 94.28% | 75% | 100% | 94.28% | 118-119 |
| packages/kernel-language-model-service/src/open-v1/nodejs.ts | 83.33% | 50% | 100% | 83.33% | 30 |
| packages/kernel-language-model-service/src/open-v1/types.ts | 100% | 100% | 100% | 100% | |
| packages/kernel-language-model-service/src/test-utils/index.ts | 100% (🟰 ±0%) | 100% (🟰 ±0%) | 100% (🟰 ±0%) | 100% (🟰 ±0%) | |
| packages/kernel-language-model-service/src/test-utils/mock-fetch.ts | 0% | 0% | 0% | 0% | 9-29 |
| packages/kernel-language-model-service/src/test-utils/mock-sample.ts | 0% | 0% | 0% | 0% | 9-16 |
| packages/kernel-test-local/src/constants.ts | 83.33% (🟰 ±0%) | 100% (🟰 ±0%) | 0% (🟰 ±0%) | 100% (🟰 ±0%) | 18 |
| packages/kernel-test-local/src/lms-chat.ts | 100% | 100% | 100% | 100% | |
| packages/kernel-test-local/src/utils.ts | 100% (🟰 ±0%) | 100% (🟰 ±0%) | 100% (🟰 ±0%) | 100% (🟰 ±0%) | |
| packages/kernel-test-local/src/vats/lms-chat-vat.ts | 0% | 0% | 0% | 0% | 22-38 |
| packages/kernel-test/src/vats/lms-chat-vat.ts | 0% | 0% | 0% | 0% | 22-40 |
| packages/kernel-test/src/vats/lms-sample-vat.ts | 0% | 0% | 0% | 0% | 24-32 |
Generated in workflow #3944 for commit ea9f656 by the Vitest Coverage Report Action

@grypez grypez force-pushed the grypez/language-model-client branch from e3cf0b3 to 5296cb9 Compare March 16, 2026 16:17
@grypez grypez marked this pull request as ready for review March 16, 2026 16:32
@grypez grypez requested a review from a team as a code owner March 16, 2026 16:32

grypez commented Mar 16, 2026

@cursoragent review this PR for inconsistencies across docstrings, documentation and naming conventions


cursor bot commented Mar 16, 2026

Unable to authenticate your request. Please make sure to connect your GitHub account to Cursor. Go to Cursor

grypez and others added 2 commits March 17, 2026 15:23
Add @ocap/kernel-language-model-service — a new package that bridges the
kernel with OpenAI-compatible language model backends.

Core types and kernel service:
- ChatParams / ChatResult / ChatService types for request/response shapes
- makeKernelLanguageModelService: registers a kernel service object that
  dispatches chat calls to a provided chat function
- LANGUAGE_MODEL_SERVICE_NAME constant

Backends:
- Open /v1 backend with SSE streaming (makeOpenV1NodejsService)
- Ollama Node.js backend (OllamaNodejsService)

Client API:
- makeChatClient: returns an OpenAI-SDK-compatible client object that
  routes chat.completions.create() calls through a CapTP ChatService
- makeSampleClient: convenience wrapper for single-token sampling

Test utilities:
- makeMockOpenV1Fetch: deterministic SSE mock for unit/integration tests

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
Add an integration test and e2e test that exercise the full
kernel → LMS service → Ollama round-trip via a bundled vat.

- lms-chat-vat: bundled vat that sends a single chat message and logs
  the response, used to verify the round-trip through the kernel
- lms-chat.ts: shared test helper (runLmsChatKernelTest)
- lms-chat.test.ts: uses a mock fetch — CI-safe, no network
- lms-chat.e2e.test.ts: uses real fetch against local Ollama
- agents.e2e.test.ts: json/repl agent e2e tests against local Ollama
- test/suite.test.ts: pre-flight check that Ollama is running

Lay out the package to match kernel-test: all vats, helpers, and test
files live under src/; test/ holds only the Ollama pre-flight suite.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
@grypez grypez force-pushed the grypez/language-model-client branch from 2dffa25 to ea9f656 Compare March 17, 2026 19:51

@cursor cursor bot left a comment


Cursor Bugbot has reviewed your changes and found 1 potential issue.


```typescript
  chat: async (params: ChatParams) => service.chat(params),
  sample: async (params: SampleParams) => service.sample(params),
});
```

New factory function exported but never reachable

Low Severity

makeOllamaNodejsKernelService is exported from ollama/nodejs.ts and its JSDoc says it's "for use with makeKernelLanguageModelService", but it is never re-exported from index.ts and has zero usages anywhere in the codebase. Consumers of the package have no path to this function through the public API.

