v0.136.0

@lgrammel lgrammel released this 07 Feb 19:09

Added

  • FileCache for caching model responses to disk. Thanks @jakedetels for the feature! Example:

    import { generateText, openai } from "modelfusion";
    import { FileCache } from "modelfusion/node";
    
    const cache = new FileCache();
    
    // First call: generates text via the OpenAI API and writes the response to the cache.
    const text1 = await generateText({
      model: openai
        .ChatTextGenerator({ model: "gpt-3.5-turbo", temperature: 1 })
        .withTextPrompt(),
      prompt: "Write a short story about a robot learning to love",
      logging: "basic-text",
      cache,
    });
    
    console.log({ text1 });
    
    // Second call with an identical prompt: the response is served from the cache.
    const text2 = await generateText({
      model: openai
        .ChatTextGenerator({ model: "gpt-3.5-turbo", temperature: 1 })
        .withTextPrompt(),
      prompt: "Write a short story about a robot learning to love",
      logging: "basic-text",
      cache,
    });
    
    console.log({ text2 }); // same text