Mathematical models of natural language semantics oscillate between two opposing approaches: the word-based statistical and the sentence-based compositional.
Word-based models rely on the ideas of Harris and Firth that words occurring in similar contexts have similar meanings. They gather co-occurrence information for words from large corpora, but the theory supporting them does not scale to sentences. Compositional models, in the sense of Montague (1970) and the type-logical calculi of Lambek and van Benthem, systematically associate the steps of a syntactic derivation with semantic operations acting on the interpretations of the constituents. With respect to word meanings, the compositional approach is agnostic: the meaning of life is LIFE.
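The Harris/Firth distributional idea can be sketched in a few lines. The toy corpus, the sentence-level context window, and the function names below are illustrative assumptions, not part of any particular model discussed at the workshop; real systems use much larger corpora and weighted counts.

```python
# Toy sketch of the distributional hypothesis: words occurring in
# similar contexts end up with similar co-occurrence vectors.
from collections import Counter
from math import sqrt

corpus = [
    "dogs chase cats",
    "dogs chase squirrels",
    "cats chase mice",
    "people read books",
    "people write books",
]

def cooccurrence_vector(word, sentences):
    """Count context words appearing in the same sentence as `word`."""
    counts = Counter()
    for sentence in sentences:
        tokens = sentence.split()
        if word in tokens:
            counts.update(t for t in tokens if t != word)
    return counts

def cosine(u, v):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(u[k] * v[k] for k in u)  # Counter returns 0 for missing keys
    norm = lambda w: sqrt(sum(c * c for c in w.values()))
    return dot / (norm(u) * norm(v)) if u and v else 0.0

dogs, cats, books = (cooccurrence_vector(w, corpus) for w in ("dogs", "cats", "books"))
print(cosine(dogs, cats) > cosine(dogs, books))  # True: "dogs" is closer to "cats"
```

On this corpus "dogs" and "cats" share the context word "chase", so their similarity is positive, while "dogs" and "books" share no contexts at all.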
There has been a recent wave of work combining these approaches to obtain vector representations for the meanings of sentences in a compositional way, but these models are still in their infancy: they either model composition by a structure-forgetting operation such as vector addition, or restrict attention to small fragments of language such as adjective-noun combinations and transitive sentences. In the meantime, the type-logical community has developed a variety of techniques to overcome the expressive limitations of the original calculi, resulting in grammar logics that can stand up to the computational confrontation with real data.
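Why vector addition is "structure-forgetting" can be seen directly: addition is commutative, so two sentences that differ only in word order receive identical representations. The word vectors below are made-up toy values, assumed purely for illustration.

```python
# Minimal sketch (with invented toy vectors) of composition by addition:
# since a + b == b + a, word order -- and hence grammatical structure --
# is lost in the sentence representation.
import numpy as np

# Hypothetical three-dimensional word vectors.
vectors = {
    "dogs": np.array([1.0, 0.0, 0.2]),
    "chase": np.array([0.0, 1.0, 0.5]),
    "cats": np.array([0.9, 0.1, 0.3]),
}

def compose_by_addition(sentence):
    """Compose a sentence meaning as the sum of its word vectors."""
    return sum(vectors[w] for w in sentence.split())

s1 = compose_by_addition("dogs chase cats")
s2 = compose_by_addition("cats chase dogs")
print(np.allclose(s1, s2))  # True: who chases whom is lost
```

Tensor-based compositional models avoid this collapse by giving relational words (verbs, adjectives) higher-order representations that act on their arguments asymmetrically.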
This workshop is an attempt to bring together active researchers from these fields to address problems of both a theoretical and a practical nature. One major goal is to develop systems in which both word vectors and complex grammatical structures can be reasoned about in a compositional and computationally tractable way.
Visit the Workshop Website for more information.