Connect to an LLM

  • In src/modules/llm.ts, create an instance of the ChatOpenAI class exported from @langchain/openai, configured with the OPENAI_API_KEY environment variable set in .env.local.

  • The test in src/modules/llm.test.ts imports this instance and calls its invoke method. Try experimenting with the prompt.
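
The environment variable can be supplied via a .env.local file at the project root. A minimal sketch — the key value shown is a placeholder, not a real key:

```sh
# .env.local
# Replace the placeholder with your own OpenAI API key.
OPENAI_API_KEY=sk-your-key-here
```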

View Solution
```typescript
// src/modules/llm.ts
import { ChatOpenAI } from "@langchain/openai";

export const llm = new ChatOpenAI({
  openAIApiKey: process.env.OPENAI_API_KEY, // read from .env.local
  modelName: "gpt-4o",
  temperature: 0, // deterministic responses, useful for tests
});
```
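
For reference, the test file looks roughly like the sketch below — the describe/it structure and the exact prompt are assumptions, but the key call is llm.invoke, which sends a prompt to the model and resolves to a message whose content holds the reply:

```typescript
// src/modules/llm.test.ts (hypothetical sketch — your actual test may differ)
import { describe, it, expect } from "vitest";
import { llm } from "./llm";

describe("llm", () => {
  it("responds to a prompt", async () => {
    // invoke accepts a plain string and returns the model's reply
    const response = await llm.invoke("Say hello in one word.");
    expect(response.content).toBeTruthy();
  });
});
```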

To verify this challenge, run the following test:

```sh
npm run test llm
```

If all tests pass, hit the button below to continue. If anything goes wrong, shout!

Summary

Good job, you’re ready for the next challenge.