Survey on Transformer-based Language Models for AI in Law


🤖 Bringing order into the realm of Transformer-based language models for artificial intelligence and law 🏛 — by Candida and Andrea T. — is our forthcoming publication in the Artificial Intelligence and Law journal, Springer Nature.

🕵‍♀️ To the best of our knowledge, this is the first systematic study of the problems, tasks, and benchmarks in the legal AI domain for which approaches and methods have been built upon Transformer-based language models, including the well-known BERT and GPT model families. It covers a wide range of legal NLP tasks, such as case-law and statutory article retrieval, classification, prediction, entailment, question answering, summarization, generation, and many others.

👉🏻 Get the article from the Springer website

👉🏻 Check it out on LinkedIn
