NLP

The true power of NLP? It’s hidden in its limits

3/23/2021

Natural Language Processing: machines that emulate us

The interaction between human beings and machines - at every level - will be crucial to the future shape of our society and economy. This topic matters a great deal to us, so this article presents an overview of NLP, a technology closely related to our work at Aptus.AI.


The Machine Learning challenge: be more human… to learn better

It is widely known that NLP systems have improved steadily over the last few years, but the reason behind this evolution is far less known. To explain this trend, we need to start from the origins of Natural Language Processing and take a look at the linguistic world. The first Machine Learning technologies were developed to learn a specific task, and were therefore forced to start from scratch every time. In this way, machines were not actually learning a language, but only training to perform that one task with ever greater accuracy. However, this is not how humans learn languages. Children learn simply by being exposed to a language. Thanks to this linguistic exposure, human beings learn to complete sentences, showing what they have acquired about the structure of the language, but also about the meaning and context of words. And it is precisely by imitating this human way of learning that the most recent NLP systems are able to predict the next word in a sentence, producing what are called language models.
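To make this concrete, here is a minimal sketch of next-word prediction with a pretrained language model. It assumes the Hugging Face transformers library and the publicly available GPT-2 checkpoint; the model name and parameters are illustrative examples, not the specific systems discussed in this post.

```python
# Minimal sketch: a pretrained language model continuing a sentence.
# Assumes the Hugging Face `transformers` library and the public GPT-2 model.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

# The model extends the prompt one token at a time, each time choosing a
# likely "next word" given the context it has seen so far.
prompt = "Kids learn a language simply by being exposed to"
outputs = generator(prompt, max_new_tokens=10, do_sample=True, num_return_sequences=3)

for out in outputs:
    print(out["generated_text"])
```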

Natural Language Processing evolution and state-of-the-art

Strictly in linguistic terms, these ideas had already emerged in the late 1950s with the so-called distributional hypothesis associated with J.R. Firth, according to which - quoting Professor Alessandro Lenci - “The degree of semantic similarity between two linguistic expressions A and B is a function of the similarity of the linguistic contexts in which A and B can appear”. Many years and technological advances later, the limits that prevented this theory from being applied to computing machines have been overcome. The first pre-neural models were built on the distributional hypothesis: Bag of Words (BoW), TF-IDF and Latent Semantic Analysis (LSA). Then, starting in 2013, neural models such as Word2Vec appeared, up to the release of BERT (2018) - more complex and more effective - and of the whole family of Transformer models (while we wait to dedicate a post to them, we recommend this article).

Today NLP systems no longer start from scratch for each new task: they start from a language model, to which the new tasks are simply added. Without going too deep into the topic (at least in this post), the same principle is also exploited for images. This is why an interesting article by Facebook AI uses the expression “dark matter of intelligence” to refer to the reproduction of these human learning dynamics - quite close to the notion of common sense - which represent the most complex aspect of Machine Learning. Obviously, the more neural networks are used to emulate the human way of learning, the greater the machines' capability to learn. It is equally obvious that ever more powerful machines are required to do so, but the path is clear, as demonstrated by the Guardian article written entirely by a bot using the GPT-3 model.
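As a rough illustration of the two eras described above, the sketch below contrasts a pre-neural, count-based representation (TF-IDF) with the contextual representations produced by a pretrained neural model. It assumes scikit-learn, PyTorch and the Hugging Face transformers library; the models named are public examples, not necessarily the ones we use at Aptus.AI.

```python
# Pre-neural vs. neural representations: a rough, illustrative comparison.
from sklearn.feature_extraction.text import TfidfVectorizer
from transformers import AutoTokenizer, AutoModel
import torch

docs = ["The bank raised interest rates.",
        "We sat on the bank of the river."]

# Pre-neural approach: each document becomes a sparse vector of TF-IDF weights.
# The word "bank" gets a single, context-independent dimension.
tfidf = TfidfVectorizer().fit_transform(docs)
print(tfidf.shape)

# Neural approach: a pretrained model such as BERT produces a different vector
# for "bank" in each sentence, because the representation depends on context.
tok = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
with torch.no_grad():
    for d in docs:
        inputs = tok(d, return_tensors="pt")
        hidden = model(**inputs).last_hidden_state  # shape: (1, n_tokens, 768)
        print(d, hidden.shape)
```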

Aptus.AI's challenge: integrating NLP and Document AI

The systems described above, however, have evident limits. For instance, they can only work on documents with a well-defined beginning and end, neither too long nor too complex in their internal structure. Concretely, a PDF file - which is not machine readable - or an overly elaborate document would be unusable by machines. Moreover, NLP systems cannot take the structure of a document into account, since they work only on plain text.
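A small example of this problem, assuming the pypdf library and a hypothetical file named regulation.pdf: extracting the plain text of a PDF flattens headings, tables and columns into a single undifferentiated string, which is all a text-only NLP pipeline gets to see.

```python
# Illustration only: plain-text extraction from a PDF discards structure.
# Assumes the `pypdf` library and a hypothetical file "regulation.pdf".
from pypdf import PdfReader

reader = PdfReader("regulation.pdf")
plain_text = "\n".join(page.extract_text() or "" for page in reader.pages)

# This string is everything a plain-text NLP pipeline receives: no headings,
# no table cells, no article/paragraph hierarchy - just characters in reading order.
print(plain_text[:500])
```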

Aware of this, at Aptus.AI we have developed systems that integrate NLP and Document AI (which we will present in a future blog post). Exploiting these two technologies in an integrated way, making them interact, allows each to add value to the other. This is how Daitomic, our AI solution for banking compliance management, was born.
