
A multitasking approach to transform NLP

6/9/2021

Parallelizing tasks in Machine Learning: the AI Transformers

Starting from machine readable formats and passing through Document AI, our journey through the world of NLP now reaches the stage of AI Transformers, which we have already mentioned in previous posts. Unlike their Hollywood namesakes, these Machine Learning models actually exist and, humor aside, they make all the difference in the world of Artificial Intelligence.


Simultaneity, long memory and attention to context… for more complex tasks

The concept that gave birth to Transformers, back in 2017, was more than anything a question: why not process the words of a sentence simultaneously, rather than one after another? The idea behind this Machine Learning model is to process inputs no longer sequentially, but in parallel. It is this multitasking approach, so to speak, that distinguishes Transformers from the model previously most used in Natural Language Processing, the Recurrent Neural Network (RNN). Even more revolutionary is the ability of Transformers to take the semantic context of a text into account through their attention mechanism. This layer identifies which words are most relevant in the context of a sentence, making it possible to link words that are far apart in a text. Thanks to their higher accuracy, AI Transformers can tackle more complex problems, and therefore tasks that were previously out of reach in many fields, Question Answering among them.
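
To make the attention idea more concrete, here is a minimal sketch of scaled dot-product self-attention in Python with NumPy. The sentence length, embedding size and random projection matrices are toy assumptions chosen for illustration; a real Transformer learns its query, key and value projections during training.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x, w_q, w_k, w_v):
    """x: (seq_len, d_model) token embeddings, processed all at once (in parallel)."""
    q, k, v = x @ w_q, x @ w_k, x @ w_v          # queries, keys, values
    scores = q @ k.T / np.sqrt(k.shape[-1])      # relevance of every word to every other word
    weights = softmax(scores, axis=-1)           # attention weights, one row per word
    return weights @ v, weights                  # context-aware representations

rng = np.random.default_rng(0)
seq_len, d_model = 5, 8                          # e.g. a five-word sentence (toy values)
x = rng.normal(size=(seq_len, d_model))
w_q, w_k, w_v = (rng.normal(size=(d_model, d_model)) for _ in range(3))
out, weights = self_attention(x, w_q, w_k, w_v)
print(weights.round(2))
```

Each row of the printed matrix sums to 1 and shows how strongly one word attends to every other word, including distant ones, which is exactly the long-range linking described above.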

New understanding opportunities: Transformers at Aptus.AI

In our Manifesto we talk about “new understanding opportunities”, and the comparison illustrated above is tangible evidence of that. The image shows the difference between a search of financial regulatory sources on the EUR-lex portal (the platform currently used to access European regulations) and the same search on Daitomic, our interactive AI software for banking compliance management. At Aptus.AI we have been working for years to improve the accuracy of our NLP systems, also exploiting the greater text understanding and generation capabilities provided by AI Transformers. On that note, if you did not know it already, let us introduce Geppetto, the first Italian text-generation system (based on GPT-2), and the more evolved GPT-3 (English only). Especially in a sector like RegTech, the semantic comprehension of legal texts is essential to deliver a technology that can properly be called Artificial Intelligence. It is from here, from banking compliance management, that our path begins: a path whose goal is to revolutionize the way people interact with digital content.
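
As a rough illustration of how a GPT-2 style generator of this kind can be queried, the sketch below uses the Hugging Face transformers library with the public English "gpt2" checkpoint. It is a generic example, not Daitomic's or Geppetto's actual code; an Italian GPT-2 checkpoint could be substituted by pointing the pipeline at its own model name.

```python
# Minimal text-generation sketch with the Hugging Face transformers library.
# "gpt2" is the public English checkpoint; the prompt is illustrative only.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")
prompt = "Transformers changed natural language processing because"
outputs = generator(prompt, max_length=40, num_return_sequences=1)
print(outputs[0]["generated_text"])
```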
