Connect to an LLM

  • In src/modules/llm.ts, create an instance of the ChatOpenAI class exported from @langchain/openai, using the OPENAI_API_KEY environment variable set in .env.local.

  • The test in src/modules/llm.test.ts imports this llm instance and calls its invoke method. Try experimenting with the prompt.
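A call to invoke can be sketched as follows. This is a hypothetical usage example, not the course's actual test: the prompt string and the main function are illustrative, and it assumes OPENAI_API_KEY is available in the environment.

```typescript
import { llm } from "./llm";

async function main() {
  // invoke() accepts a plain prompt string and resolves with a message
  // object whose `content` property holds the model's reply text.
  const response = await llm.invoke("In one sentence, what is LangChain?");
  console.log(response.content);
}

main();
```

Because invoke returns a Promise, the test awaits the result before asserting on it; swapping in a different prompt string is a quick way to experiment.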

View Solution
import { ChatOpenAI } from "@langchain/openai";

export const llm = new ChatOpenAI({
  openAIApiKey: process.env.OPENAI_API_KEY,
  modelName: "gpt-4o",
  temperature: 0,
});

To verify this challenge, run the following test:

Run Test
npm run test llm

If all tests pass, hit the button below to continue. If anything goes wrong, shout!


Good job, you’re ready for the next challenge.

