Welcome to the machine: Yoav Goldberg on NLP


I was on the phone with my grandmother the other day when she told me, “I don’t know what you do for a living, but I’m OK with whatever it is as long as you’re happy.”

In this day and age, explaining to your grandmother what you do for a living can be tricky. There are jobs that still have fairly simple definitions: if you’re an architect, you design houses; if you’re an engineer, you build them. But there are also a number of tech-related roles that are hard to grasp for someone who didn’t grow up around them. It’s easy for my grandmother to understand that my job involves some writing, but she doesn’t get the internet side of things, let alone what Unbabel’s line of business is.

When I asked Yoav Goldberg, Senior Lecturer at Bar Ilan University’s Computer Science Department, how he would explain his job to his grandmother, I was expecting him to struggle to find an explanation simple enough for the average person to understand. Yet he put it quite plainly: his job consists of “making computers do something semi-smart with language that hints that some understanding took place”.

But what exactly does a Senior Lecturer in Computer Science do on a daily basis?

Yoav Goldberg has been splitting his time between two different lines of work. The first is trying to understand how deep learning models learn things and what they can or cannot learn, which for him is an important issue since “no one really understands what’s going on.” The second line of work is figuring out how to create Natural Language Processing components or tools that will be useful for people and companies whose core business isn’t NLP and who are not experts in the field, but who still need these tools to, in some way, process language.

Even though he is interested in the application of NLP in business, he feels that, generally, there is a disconnect between academia and industry. The former tends to focus on identifying and solving new problems, while the latter wants to put those results into practice. In some cases, it is possible to take academic research and apply it to different industries, but Yoav Goldberg explains that very little of that research is developed into a product.

I think that from an academic perspective, we don’t focus enough on things that actually need focusing on. Instead we chase metrics that are easy to optimize for, but are really disconnected from what you actually want.

Although machine translation is not directly his field of research, Goldberg believes it is probably the best technology available right now that relates to his work in NLP. It’s the one that works best and that we most often see applied in business. In his opinion, that’s because it’s possible to do quite a lot with language without fully understanding it: translation is simply a matter of turning an input in one language into an output in another. You don’t have to understand what’s in the middle.
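That idea — translation as a pure input-to-output mapping with no explicit model of meaning in the middle — can be sketched as a toy function. The lookup table below is entirely hypothetical and stands in for what a real neural system learns from data; it is not how neural MT works internally, only an illustration of the shape of the problem:

```python
# Toy sketch: translation as a pure input -> output mapping.
# A real neural MT system learns this mapping from millions of
# sentence pairs; here a tiny hypothetical lexicon stands in for it.
TOY_LEXICON = {
    "the": "o", "cat": "gato", "sleeps": "dorme",
}

def translate(sentence: str) -> str:
    """Map an English sentence to (toy) Portuguese, word by word.

    Nothing in here "understands" the sentence. Unknown words are
    passed through unchanged -- a silent failure, much like the ones
    Goldberg says neural systems can't flag on their own.
    """
    return " ".join(TOY_LEXICON.get(w, w) for w in sentence.lower().split())

print(translate("The cat sleeps"))  # → o gato dorme
```

The point of the sketch is the function signature, not the body: string in, string out, and any "understanding" is hidden inside the mapping.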

This makes it easy to get machine translation to perform well enough to be useful. But it is when it fails that it’s necessary to step in to understand why. In an article he wrote back in 2017, Yoav Goldberg claimed to have a lot of respect for language, as opposed to “deep learning people who seem not to.” As he saw it, researchers in this field claimed to have solved language problems they clearly hadn’t.

They gave an impression that they didn’t really care what the data looked like. You just had to push some numbers, without really appreciating the subtleties of language.

One of these subtleties is context. For example, if you read in a newspaper something like “if prices go up, then something will happen,” it’s not prices in general that will go up, but rather the prices of something which is perhaps expressed in the title or somewhere else in the text. In that specific sentence it’s not stated, but everyone who reads it understands what it is. Machines don’t.

Yoav Goldberg is currently leading research into solving this problem. It’s very challenging because there are many ways in which people omit information while speaking or writing. Another example is saying, “No thanks, I’ve had five already.” It’s five of something that the speaker has had before, but the noun is missing after the number. The way researchers are dealing with this is by isolating different contexts and working on each of them separately. Goldberg’s team has developed a system that can already infer this type of context relatively well, but there are many other contexts to consider and work on.
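The “I’ve had five already” example — a number whose noun is omitted — can at least be *detected* with a crude heuristic, even if resolving what was counted requires the kind of learned models Goldberg’s team builds. The sketch below is purely illustrative; the word lists and rules are assumptions for demonstration, not anyone’s actual system:

```python
import re

# Illustrative heuristic only: flag a number word that is not
# immediately followed by a noun-like word. Real ellipsis detection
# uses trained models; this just shows what "the noun is missing
# after the number" means concretely.
NUMBER_WORDS = {"one", "two", "three", "four", "five",
                "six", "seven", "eight", "nine", "ten"}
NON_NOUN_FOLLOWERS = {"already", "more", "too"}  # assumed list

def has_numeric_ellipsis(sentence: str) -> bool:
    tokens = re.findall(r"[a-z']+", sentence.lower())
    for i, tok in enumerate(tokens):
        if tok in NUMBER_WORDS:
            nxt = tokens[i + 1] if i + 1 < len(tokens) else None
            # If the number ends the clause, or is followed by an
            # adverb like "already", the counted noun was omitted.
            if nxt is None or nxt in NON_NOUN_FOLLOWERS:
                return True
    return False

print(has_numeric_ellipsis("No thanks, I've had five already"))  # True
print(has_numeric_ellipsis("I've had five beers"))               # False
```

Detecting the gap is the easy half; inferring that “five” means five drinks, from context several sentences away, is the hard half that the research targets.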

When reflecting on the future of NLP research and application, Yoav Goldberg says we’ll probably still be using neural networks five or even ten years from now. They work well and there is no better technology available at the moment. There is one recurring problem with neural networks, though: they work when you have enough data, and they don’t when you don’t.

For domains where there simply isn’t enough data available, the challenge now is making models work with less data and very few examples while still generalizing well enough for the systems to perform. This is a trend researchers are likely to keep following in the coming years.

As for machine translation, Yoav Goldberg doesn’t think humans will ever be completely out of the picture, mostly because neural systems don’t yet know whether they’re making mistakes. They just don’t understand language like humans do.

But Yoav Goldberg isn’t interested in making them understand, either.

I am not driven by the pursuit of intelligence, but rather by questions like, “How is language structured and how can we do useful things with it?”

These are the questions that have guided him throughout his 20-year career — questions that he didn’t quite see coming as a kid who wanted to grow up to be a wizard or a superhero. He did play a lot of video games and was always curious about how you’d type in a command to make things happen. He also says he’s always cared about language to some extent.

Maybe wizard, superhero, or even reality TV star (a few years ago he had the chance to participate in the Israeli version of Beauty and the Geek, but declined the invitation) would be easier to explain to an older relative, but Senior Lecturer in Computer Science somehow makes more sense.
