Author Archives: Naren Sundar

On Math Writing

I recently came across a wonderful set of notes on mathematical writing, compiled by students who took Knuth’s class on the subject. The link is here. It’s impressive to see the amount of thought …

Paper: A Generative Model for Parsing Natural Language to Meaning Representations (67/365)

Today’s paper once again constructs a semantic parser. The parser is trained on sentences paired with their meaning representations, but with no finer-grained labeling of the correspondence between words and meaning tokens. The meaning representation in this paper takes …

Paper: PPDB: The Paraphrase Database (66/365)

In the last post, the authors made use of paraphrases. It turns out that there is in fact a paraphrase database, and it’s quite interesting how it was created. It starts with the basic observation that given translated texts from …
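
The observation PPDB is known to rest on is bilingual pivoting: two English phrases that both align to the same foreign phrase in a parallel corpus are likely paraphrases of each other. A minimal Haskell sketch of that idea; the phrase table below is invented for illustration, not real PPDB data:

```haskell
import qualified Data.Map.Strict as M
import Data.List (sort)

-- Toy phrase table: (English phrase, foreign pivot phrase).
-- The entries are made up for illustration.
phraseTable :: [(String, String)]
phraseTable =
  [ ("thrown into jail", "festgenommen")
  , ("imprisoned",       "festgenommen")
  , ("put in prison",    "festgenommen")
  , ("set free",         "freigelassen")
  , ("released",         "freigelassen")
  ]

-- Group English phrases by their shared foreign pivot; any two
-- distinct phrases aligned to the same pivot are candidate paraphrases.
paraphrasePairs :: [(String, String)] -> [(String, String)]
paraphrasePairs table =
  [ (a, b)
  | group <- M.elems (M.fromListWith (++) [ (f, [e]) | (e, f) <- table ])
  , a <- group, b <- group, a < b ]

main :: IO ()
main = mapM_ print (sort (paraphrasePairs phraseTable))
```

The real database additionally scores each pair (e.g. by alignment probabilities), but the grouping-by-pivot step is the core of the construction.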

Paper: Building a Semantic Parser Overnight (65/365)

In a previous post I talked about a paper that encoded questions as programs, each of which defined a procedure executed in the environment of a battleship game. In other words, each question defines a program that defines the …

Paper: Concepts in a Probabilistic Language of Thought (64/365)

In a previous post, I described a paper that wrote questions as LISP expressions. The authors of that paper take their input from today’s paper by Goodman et al. I think it’s clear to everyone that concepts, as humans …

Paper: Attention Is All You Need (63/365)

I can’t believe after my gripe yesterday that I have picked today’s paper. This paper claims that a feed-forward network with self-attention trains faster and performs better than a recurrent or a convolutional neural network on translation tasks. I have …

Paper: Question Asking as Program Generation (62/365)

I am very bored by language models that just pump the training data through something and then predict the word following a partially revealed sentence. I mean, I don’t learn anything from them apart from a lot of technical wizardry. …

A History of Natural Language Processing

Last weekend, I gave a talk at the Bangalore Computer Vision Meetup group on “A History of Natural Language Processing”. I tried to compile a perspective on the subject that may be unfamiliar to many. Take a look here: …

DSL for Generative Models – Next Steps (61/365)

I’m going to sketch out the next few things to plan and code for the DSL library. So far it can describe the network, but there is no way yet to describe the distributions. For …

Lexicographical Ordering (60/365)

> import Data.Ord (comparing)

This is completely random but I thought it was a neat use of laziness. You are familiar with lexicographical ordering? Haskell’s compare of lists implements this.

ghci> [1,2] < [1,3]
True
ghci> [1,3,4] < [1,3,4,5]
True

…
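
Expanding the ghci session into a small self-contained sketch — the infinite-list comparison and the sortBy/comparing example are my own additions, the latter using the comparing import the post mentions:

```haskell
import Data.List (sortBy)
import Data.Ord (comparing)

-- Haskell's Ord instance for lists is lexicographic: it compares
-- element by element and stops at the first position where the lists
-- differ. Because comparison is lazy, even an infinite list can be
-- compared, as long as a difference appears at some finite position.

main :: IO ()
main = do
  print ([1,2] < [1,3])                 -- True: differ at index 1
  print ([1,3,4] < [1,3,4,5])           -- True: a proper prefix is smaller
  print ([1,2,3] < (1 : 2 : repeat 4))  -- True: terminates despite the infinite list
  -- comparing builds a lexicographic comparison on a derived key
  print (sortBy (comparing reverse) [[2,1],[1,1],[3,0]])
```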
