# JLINC Langchain Tracer

The JLINC Langchain Tracer is the official way to implement the zero-knowledge third-party auditing provided by the JLINC Server inside any Langchain-based infrastructure.

By embedding JLINC's trusted protocol directly into Langchain's tracing system, organizations can prove compliance, accountability, and data integrity without ever exposing sensitive information. This seamless integration enables developers to track, verify, and audit model interactions with full transparency while preserving confidentiality through cryptographically verifiable zero-knowledge proofs. Whether for regulated industries, enterprise governance, or AI safety applications, the JLINC Langchain Tracer ensures that trust, privacy, and accountability are built in from the ground up.

## Sample application

The code sample below demonstrates the JLINC Langchain Tracer in action. As data moves through the chain, it is cryptographically signed with a unique key for each element in the chain, and zero-knowledge audit records are delivered to the JLINC Archive Server.

```js
const { ChatOpenAI } = require("@langchain/openai");
const { awaitAllCallbacks } = require("@langchain/core/callbacks/promises");
const { Calculator } = require("@langchain/community/tools/calculator");
const { AgentExecutor, createToolCallingAgent } = require("langchain/agents");
const { ChatPromptTemplate } = require("@langchain/core/prompts");
const { JLINCTracer } = require("../src/tracer.js");

async function main() {
  // Configure the tracer with the JLINC data store and archive endpoints.
  // Zero-knowledge audit records are delivered to the archive server.
  const tracer = new JLINCTracer({
    dataStoreApiUrl: "http://localhost:9090",
    dataStoreApiKey: process.env.JLINC_DATA_STORE_API_KEY,
    archiveApiUrl: "http://localhost:9090",
    archiveApiKey: process.env.JLINC_ARCHIVE_API_KEY,
    agreementId: "00000000-0000-0000-0000-000000000000",
    systemPrefix: "TracerTest",
    debug: true,
  });

  // Point the chat model at a locally hosted OpenAI-compatible endpoint.
  const llm = new ChatOpenAI({
    openAIApiKey: "n/a",
    configuration: {
      baseURL: "http://localhost:1234/v1",
    },
    modelName: "meta-llama-3.1-8b-instruct",
  });

  // Give the agent a single tool it can call.
  const calculator = new Calculator();
  const tools = [calculator];

  const prompt = ChatPromptTemplate.fromMessages([
    ["system", "You are a helpful assistant"],
    ["placeholder", "{chat_history}"],
    ["human", "{input}"],
    ["placeholder", "{agent_scratchpad}"],
  ]);

  const agent = createToolCallingAgent({ llm, tools, prompt });

  const agentExecutor = new AgentExecutor({
    agent,
    tools,
  });

  try {
    // Pass the tracer as a callback so every step in the run is signed and audited.
    const r = await agentExecutor.invoke(
      { input: "Add 1 + 1" },
      { callbacks: [tracer] }
    );
    console.log(`\nResult`);
    console.log(`---------------------------------------------`);
    console.log(r);
  } catch (err) {
    console.error("Error calling LLM:", err);
  } finally {
    // Ensure all tracer callbacks have flushed before the process exits.
    await awaitAllCallbacks();
  }
}

main();
```
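
The tracer is a standard Langchain callback handler, so it is not limited to agent runs. The following is a minimal sketch, not part of the repository's sample: it assumes the same `JLINCTracer` configuration and local OpenAI-compatible endpoint shown above, and relies only on Langchain's standard `callbacks` option to attach the tracer to a single model call.

```js
// Minimal sketch (assumes the same tracer configuration and local endpoint as the
// sample above): attach the JLINC tracer to a single chat-model call.
const { ChatOpenAI } = require("@langchain/openai");
const { awaitAllCallbacks } = require("@langchain/core/callbacks/promises");
const { JLINCTracer } = require("../src/tracer.js");

async function traceSingleCall() {
  const tracer = new JLINCTracer({
    dataStoreApiUrl: "http://localhost:9090",
    dataStoreApiKey: process.env.JLINC_DATA_STORE_API_KEY,
    archiveApiUrl: "http://localhost:9090",
    archiveApiKey: process.env.JLINC_ARCHIVE_API_KEY,
    agreementId: "00000000-0000-0000-0000-000000000000",
    systemPrefix: "TracerTest",
  });

  const llm = new ChatOpenAI({
    openAIApiKey: "n/a",
    configuration: { baseURL: "http://localhost:1234/v1" },
    modelName: "meta-llama-3.1-8b-instruct",
  });

  // The tracer receives the model-call events and emits audit records for them.
  const reply = await llm.invoke("Add 1 + 1", { callbacks: [tracer] });
  console.log(reply.content);

  // Wait for pending callback work before exiting.
  await awaitAllCallbacks();
}

traceSingleCall();
```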

## Additional information

Full JLINC Documentation: https://docs.jlinc.io

Details of the JLINC protocol, schema, and context can be found at: https://protocol.jlinc.org/.