krishna's site


Transformers (attention is all you need type stuff)

Oct 13, 20251 min read

  • different embedding techniques and embedding/encoding models
  • transformer basics
  • ~~transformer ++~~
  • different kinds of attention
  • llm inferencing techniques
  • finetuning methods
  • increasing context and post-training
  • continued learning

llm ops

the architecture basics and how everything works in theory; different types of attention and their implementations
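As a starting point for the attention notes above, here is a minimal sketch of scaled dot-product attention (the core operation from "Attention Is All You Need"), written with NumPy purely for illustration; real implementations add masking, multiple heads, and batching:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Q: (n_q, d_k), K: (n_k, d_k), V: (n_k, d_v)."""
    d_k = Q.shape[-1]
    # similarity of each query to each key, scaled so the softmax
    # doesn't saturate as d_k grows
    scores = Q @ K.T / np.sqrt(d_k)
    # numerically stable softmax over the key axis
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # each output row is a weighted average of the value rows
    return weights @ V

# toy usage: 2 queries attending over 3 key/value pairs
Q = np.random.randn(2, 4)
K = np.random.randn(3, 4)
V = np.random.randn(3, 5)
out = scaled_dot_product_attention(Q, K, V)  # shape (2, 5)
```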


Created with Quartz v4.5.2 © 2025
