— Concepts

    Context Window

    The maximum amount of text an LLM can process in a single prompt.

    What is a Context Window?

    The context window is the maximum number of tokens an LLM can process at once, counting both the input prompt and the generated response. Modern frontier models offer context windows of 200k to 2M tokens (Claude 3.7 Sonnet: 200k; Gemini 1.5 Pro: 2M). A larger window means the model can read more of your documents in a single prompt, reducing the need for retrieval techniques like RAG.
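    In practice, you budget the window between the prompt and the response before sending a request. A minimal sketch of that check, using the common rule of thumb of roughly 4 characters per token for English text (an approximation; exact counts require the model's own tokenizer, and the window size and output reserve below are illustrative values, not from any specific API):

    ```python
    # Rough context-window budgeting: does a prompt fit, leaving room for output?
    # Assumes a 200k-token window and the ~4 chars/token heuristic for English.

    CONTEXT_WINDOW = 200_000   # illustrative: a 200k-token model
    RESERVED_OUTPUT = 4_096    # tokens set aside for the generated response

    def estimate_tokens(text: str) -> int:
        """Approximate token count using ~4 characters per token."""
        return max(1, len(text) // 4)

    def fits_in_context(prompt: str,
                        window: int = CONTEXT_WINDOW,
                        reserved_output: int = RESERVED_OUTPUT) -> bool:
        """True if the prompt plus the reserved output budget fits the window."""
        return estimate_tokens(prompt) + reserved_output <= window

    document = "Quarterly report text... " * 2_000
    print(estimate_tokens(document), fits_in_context(document))
    ```

    If the check fails, the usual options are to truncate, summarize, or fall back to retrieving only the relevant chunks (RAG) instead of sending everything.
    
    
    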

    — Apply this

    From definitions to deployed projects.

    Knowing what a term means is step one. ONROL's AI Generalist track gets you shipping projects that use it.

    Reserve Free Masterclass