- In `src/modules/llm.ts`, create an instance of the `ChatOpenAI` class exported from `@langchain/openai`, using the `OPENAI_API_KEY` environment variable set in `.env.local`.
- The test in `src/modules/llm.test.ts` imports this instance and calls its `invoke` method. Try experimenting with the prompt.
Solution
```typescript
// src/modules/llm.ts
import { ChatOpenAI } from "@langchain/openai";

export const llm = new ChatOpenAI({
  openAIApiKey: process.env.OPENAI_API_KEY,
  modelName: "gpt-4o",
  temperature: 0,
});
```

To verify this challenge, run the following test:
```sh
npm run test llm
```

If all tests pass, hit the button below to continue. If anything goes wrong, shout!
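If you want to experiment with prompts outside the test, a minimal sketch of calling `invoke` directly might look like the following. The prompt text and script are illustrative, not part of the challenge; it assumes the `llm` export from `src/modules/llm.ts` and a valid `OPENAI_API_KEY` in your environment:

```typescript
// Hypothetical scratch script -- not part of the solution.
// Assumes the `llm` instance exported from src/modules/llm.ts.
import { llm } from "./llm";

async function main() {
  // invoke() sends a single prompt and resolves to an AIMessage;
  // the model's reply is on the `content` property.
  const response = await llm.invoke("Explain LangChain in one sentence.");
  console.log(response.content);
}

main();
```

Because `temperature` is set to `0`, repeated runs with the same prompt should produce near-identical output, which makes experimentation easier to reason about.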
Summary
Good job, you’re ready for the next challenge.