[TOC]
- Title: Grammar-Based Grounded Lexicon Learning
- Author: Jiayuan Mao
- Publish Year: 2021 NeurIPS
- Review Date: Dec 2021
Summary of paper
The paper extends the previous work “Neuro-Symbolic Concept Learner” by parsing the natural language questions in a symbolic manner.
The core semantic parsing technique is Combinatory Categorial Grammar (CCG), combined with a CKY-style algorithm to prune unlikely expressions.
The full picture looks like this
The detailed algorithmic process looks like this
How to derive concept embedding
e.g., “Shiny” concept is visually grounded
In the end, the concept embeddings are frozen.
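To make the grounding idea concrete, here is a minimal sketch (names and shapes are my assumptions, not the paper's code): a concept such as “Shiny” is a learnable vector, and an object is scored by comparing its visual feature to that concept embedding; after training, the embedding is frozen.

```python
import numpy as np

def concept_score(object_feature: np.ndarray,
                  concept_embedding: np.ndarray,
                  tau: float = 0.1) -> float:
    """Cosine similarity between an object's visual feature and a
    concept embedding, scaled by a temperature tau (toy sketch)."""
    cos = float(object_feature @ concept_embedding /
                (np.linalg.norm(object_feature) *
                 np.linalg.norm(concept_embedding)))
    return cos / tau

# After training, the concept embedding is frozen and reused at test time.
shiny = np.array([1.0, 0.0])   # toy frozen "Shiny" embedding
obj = np.array([0.9, 0.1])     # toy visual feature of one object
score = concept_score(obj, shiny)
```

A high score means the object is classified as exhibiting the concept; the temperature is a common trick for sharpening such similarity scores, not something the paper necessarily uses with this exact form.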
How to understand novel concept
actually, it cannot…
but the authors claim that this model is very good at compositional generalisation tasks
essentially this is a few-shot learning task, and the proposed neuro-symbolic model is good at few-shot learning (of course)
Some key terms
CKY-E algorithm
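A minimal sketch of the CKY chart-parsing idea that CKY-E builds on (the “-E” extension, which the paper uses to merge equivalent partial derivations, is omitted here; the categories and lexicon are toy assumptions):

```python
from itertools import product

def combine(left: str, right: str) -> list:
    """Forward / backward application on slash categories like 'N/N'."""
    results = []
    if left.endswith('/' + right):      # X/Y  Y   =>  X
        results.append(left[: -len(right) - 1])
    if right.endswith('\\' + left):     # Y  X\Y   =>  X
        results.append(right[: -len(left) - 1])
    return results

def cky(words: list, lexicon: dict) -> set:
    """Fill a CKY chart bottom-up; each cell holds derivable categories."""
    n = len(words)
    chart = {(i, i + 1): set(lexicon[w]) for i, w in enumerate(words)}
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span
            cell = set()
            for k in range(i + 1, j):   # try every split point
                for l, r in product(chart[(i, k)], chart[(k, j)]):
                    cell.update(combine(l, r))
            chart[(i, j)] = cell
    return chart[(0, n)]

lexicon = {'shiny': ['N/N'], 'cube': ['N']}
```

For example, `cky(['shiny', 'cube'], lexicon)` derives the category `N` for the whole phrase. In the paper, chart cells would additionally carry semantic programs, and unlikely or equivalent entries are pruned.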
Minor comments
Notes for Semantic Parsing with CCGs
CCG: Combinatory Categorial Grammar is essential in this work.
We can learn it through YouTube videos: https://www.youtube.com/watch?v=dOqe-ATkmeA&list=PLun-LUE1uLNvWi-qV-tRHohfHR90Y_cAk
Check the following notes
Lambda Calculus
Base cases
- logical constant
    - represents objects, concepts, relations
- variable
- literal
- lambda term
example: check which logical forms make sense
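The base cases above can be sketched as a tiny term language in Python (the class and constructor names are mine, for illustration only):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Const:            # logical constant: an object, concept, or relation
    name: str

@dataclass(frozen=True)
class Var:              # variable, e.g. x
    name: str

@dataclass(frozen=True)
class Literal:          # predicate applied to arguments, e.g. shiny(x)
    pred: Const
    args: tuple

@dataclass(frozen=True)
class Lam:              # lambda term, e.g. λx. shiny(x)
    var: Var
    body: object

def show(t) -> str:
    """Pretty-print a term in the usual lambda-calculus notation."""
    if isinstance(t, (Const, Var)):
        return t.name
    if isinstance(t, Literal):
        return f"{t.pred.name}({', '.join(show(a) for a in t.args)})"
    if isinstance(t, Lam):
        return f"λ{t.var.name}.{show(t.body)}"

x = Var('x')
shiny_x = Lam(x, Literal(Const('shiny'), (x,)))
```

Here `show(shiny_x)` renders as `λx.shiny(x)`, the logical form one would pair with the adjective “shiny” in a lexicon.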
Simply Typed Lambda Calculus
Constructing Lambda Calculus Expressions
Combinatory Categorial Grammar
The grammar formalism that we will use is called Combinatory Categorial Grammar (CCG).
CCG categories
Lexicons
CCG Operations
How to parse with CCG, given an example
- step 1, use lexicon to match words and phrases with their categories
- step 2, apply operation rules
- step 3, further operation rules, e.g., composition rule
- step 4, coordinate composed adjectives
- step 5, apply coordinated adjectives to noun
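Steps 1 and 2 above can be sketched as follows: look up lexical entries that pair a category with a semantic function, then combine them by forward application. The categories and lambda semantics here are toy assumptions, not the paper's actual lexicon.

```python
# Each lexical entry pairs a CCG category with a semantic function.
lexicon = {
    'shiny': ('N/N', lambda noun: lambda x: f"shiny({x}) & {noun(x)}"),
    'cube':  ('N',   lambda x: f"cube({x})"),
}

def forward_apply(fn_entry, arg_entry):
    """Forward application: X/Y applied to Y yields X; the semantics of
    the functor is applied to the semantics of the argument."""
    (fcat, fsem), (acat, asem) = fn_entry, arg_entry
    assert fcat.endswith('/' + acat), "categories do not match"
    return fcat[: -len(acat) - 1], fsem(asem)

cat, sem = forward_apply(lexicon['shiny'], lexicon['cube'])
# cat == 'N'; sem('x') == 'shiny(x) & cube(x)'
```

Composition and coordination (steps 3 and 4) work similarly but combine two functor categories before any argument is supplied; a full parser would also search over alternative lexical entries, as in the CKY chart.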
Potential future work
a good baseline if we want to explore neuro-symbolic semantic parsing