Presented by: Ken Kahn (University of Oxford)

Consider the task of searching the Snap! manual. String matching cannot take into account synonyms, different ways of saying the same thing, or different spelling conventions. In this Snap! project, sentence embeddings are used to compare the user's query with sentence fragments from the manual. The embedding of a sentence is a vector of 512 numbers produced by a deep neural network. The embeddings of the fragments (all 1685 of them) have been pre-computed, so only the embedding of the user's query needs to be computed at search time. Once the closest fragments have been found, we can fall back on string search, since the fragments were derived from the document being searched. This was packaged up as a sprite that, when imported, enables one to search the manual from within Snap! either by typing or speaking a query.
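The core idea above (compare one query embedding against many pre-computed fragment embeddings and keep the closest ones) can be sketched in a few lines of Python. This is only an illustrative sketch, not the project's actual code: the names `closest_fragments` and `fragment_embeddings` are made up here, the similarity measure is assumed to be cosine similarity, and random vectors stand in for real sentence embeddings.

```python
import numpy as np

def cosine_similarities(query, matrix):
    """Cosine similarity between a query vector and each row of a matrix."""
    q = query / np.linalg.norm(query)
    m = matrix / np.linalg.norm(matrix, axis=1, keepdims=True)
    return m @ q

def closest_fragments(query_embedding, fragment_embeddings, fragments, k=5):
    """Return the k fragments whose embeddings are closest to the query."""
    scores = cosine_similarities(query_embedding, fragment_embeddings)
    top = np.argsort(scores)[::-1][:k]          # indices of the k highest scores
    return [(fragments[i], float(scores[i])) for i in top]

# Toy demonstration: random stand-ins for the 1685 pre-computed
# 512-dimensional fragment embeddings mentioned in the talk.
rng = np.random.default_rng(0)
fragment_embeddings = rng.normal(size=(1685, 512))
fragments = [f"fragment {i}" for i in range(1685)]

# A query identical to fragment 42 should rank that fragment first.
query = fragment_embeddings[42]
results = closest_fragments(query, fragment_embeddings, fragments, k=3)
print(results[0][0])
```

Because the fragment embeddings are fixed ahead of time, each search costs only one neural-network call (for the query) plus a cheap matrix-vector product like the one above.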

Here is the documentation.

Duration: 5 min
Room: Plenary
Conference: Snap!Con 2022
Type: Lightning Talk