Build an LLM Chat App Using LangGraph, OpenAI and Python—Part 2: Understanding SystemMessage

When building conversational AI with LangChain, you send and receive information through messages. Each message has a specific role, helping you shape the flow, tone and context of the conversation. LangChain supports several message types:

- HumanMessage – represents the user's input
- AIMessage – represents the model's response
- SystemMessage – sets the behavior or rules for the model
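To make the roles concrete, here is a minimal sketch using langchain_core.messages with a ChatOpenAI model; the model name and prompt text are placeholder assumptions, not part of the original post:

```python
from langchain_core.messages import SystemMessage, HumanMessage
from langchain_openai import ChatOpenAI

# The SystemMessage sets the rules before the conversation starts;
# HumanMessage and AIMessage carry the actual exchange.
messages = [
    SystemMessage(content="You are a helpful assistant that answers in one short sentence."),
    HumanMessage(content="What does a SystemMessage do?"),
]

llm = ChatOpenAI(model="gpt-4o-mini")  # model name is just an example
response = llm.invoke(messages)        # the reply comes back as an AIMessage

print(type(response).__name__)  # AIMessage
print(response.content)
```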

GenAI for Beginners: What is the Temperature parameter in a model

When working with Large Language Models (LLMs), it is essential to understand the key parameters that influence the model's behaviour. Two of the most critical are:

- Temperature
- Top-P (nucleus sampling) value

Temperature controls the randomness of the model's output by affecting how the model selects the next token to generate.
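As a rough illustration of the effect, here is a small sketch comparing a low and a high temperature setting with ChatOpenAI; the model name and prompt are placeholder assumptions, and the actual outputs will vary from run to run:

```python
from langchain_openai import ChatOpenAI

prompt = "Suggest a name for a coffee shop."

# Low temperature: the model strongly favours the highest-probability tokens,
# so repeated calls tend to return very similar answers.
deterministic_llm = ChatOpenAI(model="gpt-4o-mini", temperature=0.0)

# High temperature: the probability distribution is flattened, so
# lower-probability tokens are sampled more often and answers vary more.
creative_llm = ChatOpenAI(model="gpt-4o-mini", temperature=1.2)

print(deterministic_llm.invoke(prompt).content)
print(creative_llm.invoke(prompt).content)
```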