In this paper, I present {\bf \inlp}, a new control-flow wrapper abstraction that enables existing context-needing Natural Language Processing (NLP) models to function without any given context by utilizing data from the internet (or from a knowledge database when offline). Internet-NLP can be used and finetuned alongside existing NLP models via its config settings, and its Long Short-Term Memory (LSTM) neural network can also be trained. Additionally, Masked Language Models (MLMs) such as BERT or LinkBERT \cite{devlin-etal-2019-bert,yasunaga-etal-2022-linkbert} can be incorporated to improve search queries and therefore retrieve more accurate and reliable data. Furthermore, Internet-NLP utilizes an LSTM, Reinforcement Learning, and caches to support multi-turn NLP tasks and to improve via Reinforcement Learning from user feedback.
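
To make the control flow concrete, here is a minimal sketch of the wrapper idea in Python, assuming a generic extractive QA pipeline from HuggingFace Transformers; the helper names and the toy offline knowledge database are illustrative assumptions, not Internet-NLP's actual API.
\begin{verbatim}
# Minimal sketch of the control-flow wrapper idea (illustrative only):
# fetch context from the internet (or an offline knowledge database),
# then hand it to an unmodified context-needing QA model.
from transformers import pipeline

qa_model = pipeline("question-answering")  # any extractive QA model

def web_search(query: str, k: int = 3) -> list[str]:
    # Stand-in for live retrieval: a toy offline "knowledge database".
    # Internet-NLP would query the internet here and cache the results.
    toy_db = [
        "The Eiffel Tower is located in Paris, France.",
        "Mount Everest is the highest mountain above sea level.",
    ]
    terms = query.lower().split()
    return [s for s in toy_db if any(t in s.lower() for t in terms)][:k]

def answer_without_context(question: str) -> str:
    # Retrieve snippets and join them into a synthetic context, so the
    # context-needing QA model can run as if context were given.
    context = " ".join(web_search(question)) or "No results found."
    return qa_model(question=question, context=context)["answer"]

print(answer_without_context("Where is the Eiffel Tower?"))
\end{verbatim}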
Internet-NLP, in basic terms, provides the context that context-needing NLP models require in order to function. It can be improved by finetuning and by training its LSTM and Reinforcement Learning models (which can be trained alongside the NLP model), which yields better search queries and, subsequently, better results. It obtains state-of-the-art (SOTA) results in QA and NLI without context.
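
As a purely hypothetical illustration of how an MLM could sharpen a search query, the sketch below uses a fill-mask pipeline to propose expansion keywords; Internet-NLP's actual query-improvement step is not assumed to work this way.
\begin{verbatim}
# Hypothetical sketch of MLM-based search-query refinement: ask a
# fill-mask model (e.g. BERT) for plausible continuation terms and
# append them as extra search keywords. Illustrative only.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

def expand_query(query: str, n_terms: int = 3) -> str:
    # Ask the MLM for plausible continuations of the query and append
    # the top candidates as extra search keywords.
    predictions = fill_mask(f"{query} [MASK].", top_k=n_terms)
    return query + " " + " ".join(p["token_str"].strip() for p in predictions)

print(expand_query("highest mountain in the"))  # outputs vary by model
\end{verbatim}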
In this paper, I also present new NLP and Natural Language Inference (NLI) models to assist {\bf \inlp}:
\begin{itemize}
\item Open-book question answering (QA) with long answers via GPT-NeoX-20B \cite{gpt-neox-library, gpt-neox-20b}
\item CrossEncoder NLI via LinkBERT \cite{reimers-2019-sentence-bert,thakur-2020-AugSBERT, yasunaga-etal-2022-linkbert} (see the scoring sketch after this list)
\item Answer-to-context conversion via T5 \cite{https://doi.org/10.48550/arxiv.1910.10683}
\end{itemize}
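
For the CrossEncoder NLI component, the snippet below shows the basic scoring pattern with the sentence-transformers CrossEncoder class; the public DeBERTa checkpoint is only a stand-in, since the LinkBERT cross-encoder presented here is not assumed to be publicly hosted.
\begin{verbatim}
# Basic CrossEncoder NLI scoring with sentence-transformers. The
# checkpoint below is a public stand-in, not the LinkBERT cross-encoder
# presented in this paper.
from sentence_transformers import CrossEncoder

nli = CrossEncoder("cross-encoder/nli-deberta-base")
scores = nli.predict([
    ("A man is eating food.", "A man is eating."),      # -> entailment
    ("A man is eating food.", "The man is sleeping."),  # -> contradiction
])
labels = ["contradiction", "entailment", "neutral"]
for row in scores:
    print(labels[row.argmax()])
\end{verbatim}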
Along with these models, I also present new general-purpose QA and NLI datasets:
\begin{itemize}
\item ALotNLI, built from the ANLI, MultiNLI, and SNLI datasets \cite{nie-etal-2020-adversarial,N18-1101,DBLP:journals/corr/BowmanAPM15} (see the assembly sketch after this list)
\item ALotOpenBookQA, built from the CoQA, Natural Questions, and SQuAD datasets \cite{DBLP:journals/corr/abs-1808-07042,kwiatkowski-etal-2019-natural,DBLP:journals/corr/abs-1806-03822}
\end{itemize}
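
A corpus of this shape can be assembled with the HuggingFace datasets library, as sketched below for ALotNLI; the exact filtering and label handling used to build the released datasets are assumptions here.
\begin{verbatim}
# Sketch of assembling an ALotNLI-style corpus by merging ANLI,
# MultiNLI, and SNLI. The exact preprocessing used for the released
# dataset is an assumption; this only shows the general recipe.
from datasets import load_dataset, concatenate_datasets

keep = ["premise", "hypothesis", "label"]

def slim(dataset):
    # Drop dataset-specific columns so the schemas line up.
    extra = [c for c in dataset.column_names if c not in keep]
    return dataset.remove_columns(extra)

snli = slim(load_dataset("snli", split="train"))
mnli = slim(load_dataset("multi_nli", split="train"))
anli = slim(load_dataset("anli", split="train_r1"))  # round 1 only

merged = concatenate_datasets([snli, mnli, anli])
merged = merged.filter(lambda ex: ex["label"] != -1)  # drop unlabeled pairs
print(merged)
\end{verbatim}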
As a result of Internet-NLP and these models and datasets, the accuracy and reliability of most context-needing NLP models increased on most NLP tasks, especially tasks that require more factual responses with no given context.
Internet-NLP works together with the new NLP and NLI models, which were trained on the general-purpose datasets (ALotNLI and ALotOpenBookQA). By default, Internet-NLP utilizes the text-generative model GPT-NeoX \cite{gpt-neox-library, gpt-neox-20b} for long responses and LinkBERT \cite{yasunaga-etal-2022-linkbert} for short responses. For two-choice decisions (e.g., True and False) a Bi-Encoder NLI model is used, and for multiple choices a CrossEncoder is used \cite{thakur-2020-AugSBERT}.
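
The following is a schematic of this default dispatch, with small public checkpoints standing in for GPT-NeoX-20B and LinkBERT; the identifiers below are placeholders, not Internet-NLP's actual configuration.
\begin{verbatim}
# Schematic of the default model dispatch described above. The model
# identifiers are small public stand-ins, not Internet-NLP's config
# (the paper's defaults are GPT-NeoX-20B and LinkBERT).
from transformers import pipeline

def build_answerer(answer_length: str):
    if answer_length == "long":
        # Long free-form responses: a text-generative model.
        return pipeline("text-generation", model="EleutherAI/gpt-neo-125m")
    # Short responses: an extractive QA model (LinkBERT-style).
    return pipeline("question-answering")

def nli_strategy(num_choices: int) -> str:
    # Two-way decisions (e.g. True/False) use the cheaper Bi-Encoder;
    # multi-way decisions use the more accurate CrossEncoder.
    return "bi-encoder" if num_choices == 2 else "cross-encoder"
\end{verbatim}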
\begin{comment}
In this paper, we propose Internet-NLP, a novel control-flow wrapper abstraction that allows existing context-dependent Natural Language Processing (NLP) models to utilize data from the internet as context, enabling them to function without any given context. Using the internet as a context source is particularly useful for NLP models that require real-time or current information to perform their tasks accurately.
Internet-NLP can be fine-tuned alongside existing NLP models using its config settings and optimizations of Masked Language Models (MLM) and Text2Text Models. This can improve search queries and retrieve more accurate and reliable data. Additionally, Internet-NLP can utilize large NLP models such as GPT-3 or GPT-NeoX-20B for multi-turn NLP tasks and can be improved through Reinforcement Learning from user interactions. Caches of internet results can also be tuned to enable faster computation for repetitive tasks.
To assist Internet-NLP, we also present a suite of new NLP and Natural Language Inference (NLI) models, including GPT-NeoX-20B for open-book question answering (QA), LinkBERT for CrossEncoder NLI, and T5 for statement-to-query and answer-to-context conversion. These models have been specifically designed to work with Internet-NLP to improve the accuracy and reliability of context-dependent NLP tasks. We also introduce two new general-purpose QA and NLI datasets: ALotNLI, which is made from ANLI, MultiNLI, and SNLI, and ALotOpenBookQA, which is made from CoQA, Natural Questions, and SQuAD. These datasets provide a diverse range of contexts and information that can be used to train and evaluate the performance of Internet-NLP and the accompanying NLP and NLI models.
The results of our evaluation show that Internet-NLP significantly improves the accuracy and reliability of context-dependent NLP models on various tasks, particularly those requiring factual responses with no given context. We achieve state-of-the-art results in QA with a no-context accuracy of approximately 64.7% when tested manually on the ALotOpenBookQA dataset and with random recent events. Internet-NLP enables NLP models to stay connected to current events without requiring frequent updates or large models and datasets. Overall, the combination of Internet-NLP and the accompanying NLP and NLI models represents a significant advance in the field of NLP and has the potential to revolutionize the way that NLP models are used in real-world applications.
\end{comment}