
Learn Semantics

Read the notes, then try the practice. It adapts as you go. Start when you're ready.

Session Length

~17 min

Adaptive Checks

15 questions

Transfer Probes

8 probes

Lesson Notes

Semantics is the branch of linguistics and philosophy concerned with meaning. It investigates how words, phrases, sentences, and larger units of discourse acquire and convey meaning, and how listeners and readers interpret that meaning in context. At its core, semantics asks fundamental questions: What does it mean for a word to 'mean' something? How do the meanings of individual words combine to produce the meaning of a sentence? And how do meaning relations such as synonymy, antonymy, and entailment structure our mental lexicon and our reasoning? These questions sit at the intersection of linguistics, philosophy, cognitive science, and computer science, making semantics one of the most genuinely interdisciplinary fields of study.

Formal semantics, rooted in the work of Gottlob Frege, Bertrand Russell, and later Richard Montague, uses the tools of logic and mathematics to model how natural language expressions map to truth conditions and possible worlds. Montague Grammar demonstrated that natural language could be analyzed with the same formal rigor as artificial languages, opening the door to precise compositional theories of meaning. In parallel, lexical semantics examines the internal structure of word meaning through concepts such as prototype theory, semantic fields, thematic roles, and componential analysis. Cognitive semantics, championed by scholars like George Lakoff and Ronald Langacker, argues that meaning is grounded in embodied human experience and that metaphor and mental imagery are central to how we understand language.

In the modern era, semantics has become indispensable to computational applications. Natural language processing systems rely on semantic parsing, word embeddings, and large language models to interpret and generate human language. Formal ontologies and knowledge graphs encode semantic relationships for search engines, question-answering systems, and the Semantic Web. Whether one is studying the philosophy of reference, analyzing ambiguity in legal texts, building a chatbot, or tracing how children acquire word meaning, semantics provides the conceptual toolkit for understanding how language carries meaning from mind to mind.
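The word embeddings mentioned above can be sketched with a toy example. Real systems learn high-dimensional vectors from large corpora; here the three-dimensional vectors are hand-picked purely for illustration, so that semantically related words ('cat', 'dog') end up closer than unrelated ones ('cat', 'car') under cosine similarity.

```python
from math import sqrt

# Toy 3-dimensional "embeddings" (values invented for illustration only;
# real models learn vectors with hundreds of dimensions from text corpora).
embeddings = {
    "cat": [0.90, 0.80, 0.10],
    "dog": [0.85, 0.75, 0.20],
    "car": [0.10, 0.20, 0.95],
}

def cosine(u, v):
    """Cosine similarity: closer to 1 means more similar meaning vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = sqrt(sum(a * a for a in u)) * sqrt(sum(b * b for b in v))
    return dot / norm

# Related animals score higher than the unrelated pair.
print(cosine(embeddings["cat"], embeddings["dog"]) >
      cosine(embeddings["cat"], embeddings["car"]))  # True
```

The design point is that "meaning" here is purely distributional: similarity falls out of vector geometry, with no reference to truth conditions at all, which is exactly the contrast with the formal approaches described earlier.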

You'll be able to:

  • Analyze meaning relationships including synonymy, antonymy, hyponymy, and polysemy using formal semantic frameworks and natural language examples
  • Evaluate truth-conditional and cognitive approaches to sentence meaning, including phenomena such as compositionality, presupposition, and conversational implicature
  • Compare referential, representational, and inferential theories of meaning to explain how language connects to the world
  • Distinguish between lexical semantics, compositional semantics, and pragmatic meaning in resolving ambiguity and context-dependent interpretation

One step at a time.

Key Concepts

Compositionality (Frege's Principle)

The principle that the meaning of a complex expression is determined by the meanings of its parts and the rules used to combine them. This allows speakers to understand and produce an infinite number of novel sentences.

Example: The meaning of 'The cat sat on the mat' is built compositionally: you understand 'the cat,' 'sat,' 'on,' and 'the mat' individually, and grammar tells you who did what where.
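One standard way to make this compositional buildup precise is the lambda calculus used in Montague-style semantics. The sketch below assumes a simple extensional setting and, for brevity, treats 'sat on the mat' as a single one-place predicate:

```latex
\[
\begin{aligned}
\llbracket \text{sat on the mat} \rrbracket &= \lambda x.\, \mathrm{sat\text{-}on\text{-}mat}(x) \\
\llbracket \text{the cat} \rrbracket &= \mathbf{c} \qquad \text{(the contextually unique cat)} \\
\llbracket \text{the cat sat on the mat} \rrbracket
  &= \llbracket \text{sat on the mat} \rrbracket\bigl(\llbracket \text{the cat} \rrbracket\bigr)
  = \mathrm{sat\text{-}on\text{-}mat}(\mathbf{c})
\end{aligned}
\]
```

The sentence meaning is obtained by function application: the predicate's meaning applies to the subject's meaning, exactly as Frege's principle requires.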

Sense and Reference

Frege's distinction between the sense (Sinn) of an expression — the mode of presentation or concept associated with it — and its reference (Bedeutung) — the actual object or truth value it picks out in the world.

Example: 'The Morning Star' and 'The Evening Star' have different senses (different descriptions) but the same reference (both refer to Venus).

Truth Conditions

The conditions under which a sentence is true or false. In formal semantics, knowing the meaning of a sentence is often equated with knowing the circumstances that would make it true.

Example: The truth conditions of 'Snow is white' are satisfied if and only if snow is, in fact, white.
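This model-theoretic idea can be sketched in a few lines. The "world" below is a toy model (predicates and facts invented for illustration): an atomic sentence of the form '<entity> is <predicate>' is true in the model iff the entity belongs to the predicate's extension.

```python
# A toy model: each predicate maps to its extension, i.e. the set of
# things it is true of in this (invented) world.
model = {
    "white": {"snow", "chalk"},
    "green": {"grass"},
}

def is_true(predicate, entity, world):
    """'<entity> is <predicate>' is true iff entity is in the predicate's extension."""
    return entity in world.get(predicate, set())

print(is_true("white", "snow", model))   # True: snow is in the extension of 'white'
print(is_true("white", "grass", model))  # False
```

Knowing the meaning of 'Snow is white', on this view, just is knowing which models (circumstances) make `is_true("white", "snow", ...)` come out true.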

Lexical Semantics

The subfield of semantics that studies the meanings of individual words, including their internal structure, meaning relations, and how word meanings are organized in the mental lexicon.

Example: Analyzing 'kill' as composed of semantic components CAUSE + BECOME + NOT ALIVE is a lexical semantic decomposition.

Entailment

A semantic relation between sentences in which the truth of one sentence necessarily guarantees the truth of another. Unlike implicature, entailment is a matter of logical necessity.

Example: 'John killed the wasp' entails 'The wasp is dead.' If the first sentence is true, the second must be true.

Presupposition

A background assumption that must be true for a sentence to be felicitous. Unlike entailments, presuppositions survive under negation.

Example: 'The king of France is bald' presupposes that there is a king of France. 'The king of France is not bald' carries the same presupposition.

Implicature

Meaning that is suggested or implied by an utterance beyond what is strictly said. Grice distinguished between conventional implicature (tied to specific words) and conversational implicature (arising from context and cooperative principles).

Example: If someone asks 'Can you pass the salt?' the literal meaning is a question about ability, but the conversational implicature is a request.

Prototype Theory

Eleanor Rosch's theory that categories are organized around central, prototypical members rather than strict necessary-and-sufficient conditions. Category membership is a matter of degree.

Example: A robin is a more prototypical bird than a penguin; people categorize 'robin' as a bird faster and more confidently.

More terms are available in the glossary.

Explore your way

Choose a different way to engage with this topic — no grading, just richer thinking.


Explore with AI →

Concept Map

See how the key ideas connect. Nodes color in as you practice.

Worked Example

Walk through a solved problem step-by-step. Try predicting each step before revealing it.

Adaptive Practice

This is guided practice, not just a quiz. Hints and pacing adjust in real time.

Small steps add up.

What you get while practicing:

  • Math Lens cues for what to look for and what to ignore.
  • Progressive hints (direction, rule, then apply).
  • Targeted feedback when a common misconception appears.

Teach It Back

The best way to know if you understand something: explain it in your own words.

Keep Practicing

More ways to strengthen what you just learned.

Semantics Adaptive Course - Learn with AI Support | PiqCue