
Deployment

llm-ops is a framework that supports CommonJS, ESM, and TypeScript, and it runs in any environment with Node.js, including Linux, macOS, and Windows.
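
For example, the same package can be consumed from either module system; a minimal sketch (the import style is the only difference):

```ts
// ESM / TypeScript
import { Pipeline, PipeRegistry } from "llm-ops";

// CommonJS (equivalent)
// const { Pipeline, PipeRegistry } = require("llm-ops");
```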

Due to issues with some of its dependencies, llm-ops cannot yet be deployed directly in the browser; this will be improved in a future release.

Next/Nuxt

Although llm-ops cannot currently run directly in the browser, you generally should not call it from the browser anyway, to avoid leaking your API key.

On the frontend, you can use Next or Nuxt for SSR/CSR rendering: keep the core logic on the server and have the frontend call the corresponding functions.

Here we use the Next framework as an example to show how to use llm-ops from the frontend.

The complete project code is available on GitHub.

```js
"use server";
import { Pipeline, PipeRegistry, LLM_OPS_CONFIG } from "llm-ops";

export const llm_ops_run = async (
  schema,
  inputText,
  openaiKey,
  heliconeKey,
) => {
  // Configure the OpenAI key; optionally route requests through Helicone
  // for logging and observability.
  LLM_OPS_CONFIG.OPENAI_API_KEY = openaiKey || "";
  if (heliconeKey) {
    LLM_OPS_CONFIG.HELICONE_AUTH_API_KEY = heliconeKey;
    LLM_OPS_CONFIG.OPEN_PATH = {
      baseURL: "https://oai.hconeai.com/v1",
      defaultHeaders: {
        "Helicone-Auth": `Bearer ${heliconeKey}`,
      },
    };
  }
  // Rebuild the pipeline from its serialized JSON schema and run it.
  const funcStore = PipeRegistry.init();
  const schemaJson = JSON.parse(schema);
  const pipeline = Pipeline.fromJSON(schemaJson, {}, funcStore);
  return await pipeline.execute(inputText);
};
```
"use server";
import { Pipeline, PipeRegistry, LLM_OPS_CONFIG, EventEmitter } from "llm-ops";
import { useAppContext } from "@/lib/AppContext";
export const llm_ops_run = async (
  schema,
  inputText,
  openaiKey,
  heliconeKey,
) => {
  try {
    LLM_OPS_CONFIG.OPENAI_API_KEY = openaiKey || "";
    !!heliconeKey && (LLM_OPS_CONFIG.HELICONE_AUTH_API_KEY = heliconeKey);
    !!heliconeKey &&
      (LLM_OPS_CONFIG.OPEN_PATH = {
        baseURL: "https://oai.hconeai.com/v1",
        defaultHeaders: {
          "Helicone-Auth": `Bearer ${heliconeKey}`,
        },
      });
    const funcStore = PipeRegistry.init();
    const schemaJson = JSON.parse(schema);
    const pipeline = Pipeline.fromJSON(schemaJson, {}, funcStore);
    const res = await pipeline.execute(inputText);
    return res;
  } catch (e) {
    throw e;
  }
};
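
On the client side, the server action above can be invoked like any async function. A minimal sketch, assuming the action is exported from a hypothetical `@/lib/llm-ops-action` module and that the user supplies their own key through the UI (as in the demo project); the `{ pipes: [] }` schema here is a placeholder:

```tsx
"use client";
import { useState } from "react";
import { llm_ops_run } from "@/lib/llm-ops-action"; // hypothetical path to the server action above

export default function PipelineDemo() {
  const [openaiKey, setOpenaiKey] = useState("");
  const [result, setResult] = useState("");
  const run = async () => {
    // Placeholder schema: substitute your own serialized pipeline.
    const schema = JSON.stringify({ pipes: [] });
    const res = await llm_ops_run(schema, "Hello!", openaiKey, "");
    setResult(JSON.stringify(res));
  };
  return (
    <div>
      <input
        value={openaiKey}
        onChange={(e) => setOpenaiKey(e.target.value)}
        placeholder="OpenAI key"
      />
      <button onClick={run}>Run pipeline</button>
      <pre>{result}</pre>
    </div>
  );
}
```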

laf

llm-ops can also be deployed directly to serverless cloud functions. Here we use laf as an example to show how to run llm-ops on laf.

```ts
import cloud from "@lafjs/cloud";
import { Pipeline, PipeRegistry, SerializablePipelineOptions } from "llm-ops";

export default async function (ctx: FunctionContext) {
  // Expect a serialized pipeline schema and an input string in the request body.
  const { schema, input } = ctx.body;
  if (!schema || !input) {
    return { code: "001", msg: "invalid input", data: "" };
  }
  console.log("schema", schema);
  const funcStore = PipeRegistry.init();
  if (schema.pipes) {
    // Rebuild the pipeline from the serialized schema and run it on the input.
    const pipeline = Pipeline.fromJSON(
      schema as SerializablePipelineOptions,
      {},
      funcStore,
    );
    const res = await pipeline.execute(input);
    return { code: "200", msg: "", data: res };
  } else {
    return { code: "002", msg: "invalid input: schema.pipes is missing", data: "" };
  }
}
```
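
Once deployed, the function is just an HTTP endpoint. A minimal sketch of calling it, assuming a placeholder laf URL and schema:

```ts
// Placeholder endpoint: replace with your own laf function URL.
const res = await fetch("https://<your-app>.laf.run/llm-ops", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    schema: { pipes: [] }, // your serialized pipeline options
    input: "Hello!",
  }),
});
const { code, msg, data } = await res.json();
console.log(code, msg, data);
```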

With laf, we can quickly deploy this to a cloud function and make llm-ops callable online.

The complete laf function code is available in the function marketplace.
