How GPT-3 Works, with Examples

Generative Pre-trained Transformer 3 (GPT-3) is an autoregressive language model that uses deep learning to produce human-like text. It is the third-generation language prediction model in the GPT-n series created by OpenAI, a San Francisco-based artificial intelligence research laboratory.
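"Autoregressive" means the model generates text one token at a time, feeding everything generated so far back in as context for the next prediction. The sketch below illustrates that loop with a hypothetical hand-written lookup table standing in for GPT-3's actual network; the table and its probabilities are invented for illustration only.

```python
# A toy sketch of autoregressive generation: repeatedly predict the
# next token from everything generated so far, then append it.
# TOY_MODEL is a hypothetical stand-in for a real language model.

TOY_MODEL = {
    ("the",): {"cat": 0.6, "dog": 0.4},
    ("the", "cat"): {"sat": 0.7, "ran": 0.3},
    ("the", "cat", "sat"): {"down": 0.9, "up": 0.1},
}

def generate(prompt, steps):
    tokens = list(prompt)
    for _ in range(steps):
        probs = TOY_MODEL.get(tuple(tokens))
        if probs is None:  # unknown context: stop generating
            break
        # Greedy decoding: take the most probable next token.
        tokens.append(max(probs, key=probs.get))
    return tokens

print(generate(["the"], 3))  # ['the', 'cat', 'sat', 'down']
```

GPT-3 does the same thing, except its next-token distribution comes from a 175-billion-parameter transformer rather than a lookup table.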

Your phone's predictive-text feature uses a neural network to suggest the next word as you type a message.

next word prediction

What Is Next Word Prediction?
The easiest way to write a sentence is to type the first word and then pick each following word from the predictive-text suggestions. It is fast, easy, and a great feature for mobile users, but it has one limitation: predictive text only suggests the single next word, so it cannot help you plan the rest of the sentence.
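Phone-style predictive text can be as simple as counting which word tends to follow which. The sketch below builds such a model from a tiny sample text (the text and function names are illustrative, not any phone vendor's actual implementation); note that it only ever looks one word ahead, which is exactly the limitation described above.

```python
# A minimal phone-style predictive-text sketch: count word-follower
# frequencies in sample text, then suggest the most frequent followers.
from collections import Counter, defaultdict

def train(text):
    follows = defaultdict(Counter)
    words = text.lower().split()
    for prev, nxt in zip(words, words[1:]):
        follows[prev][nxt] += 1
    return follows

def suggest(follows, word, k=3):
    # Return up to k words most frequently seen after `word`.
    return [w for w, _ in follows[word].most_common(k)]

model = train("i am happy . i am here . i was here")
print(suggest(model, "i"))  # ['am', 'was']
```

Because the counts only relate adjacent words, the model has no idea what the sentence is about; GPT-3's advantage is that it conditions on the entire preceding context instead.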

How Does GPT-3 Handle Next Word Prediction?
GPT uses a neural network to predict the next word. The prediction is driven by learned connections between words, organized into several layers:
The first layer is the input layer. It represents the words of the vocabulary, and its connections feed into the next layer, called the hidden layer.
The middle layers are the hidden layers. They transform the representations coming from the previous layer, step by step, toward a prediction of the next word.
The last layer is the output layer.

Neural network example

This layer assigns a score to every word in the vocabulary, based on the activations passed up from the last hidden layer; the highest-scoring word is the predicted next word.
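The three layers described above can be sketched as a tiny forward pass: a one-hot input over a four-word vocabulary, one hidden layer with a ReLU activation, and an output layer that scores every vocabulary word. The weight matrices here are arbitrary illustrative numbers, not trained values, so the predictions are not meaningful, only the mechanics are.

```python
# A minimal forward pass: input layer (one-hot) -> hidden layer -> output
# layer scoring every vocabulary word as the candidate next word.
# W1 and W2 are made-up weights for illustration, not trained parameters.

VOCAB = ["the", "cat", "sat", "mat"]

W1 = [[0.5, -0.2], [0.1, 0.8], [-0.3, 0.4], [0.7, 0.2]]  # 4 inputs x 2 hidden
W2 = [[0.2, -0.5, 0.9, 0.1], [0.6, 0.3, -0.1, 0.4]]      # 2 hidden x 4 outputs

def forward(word):
    x = [1.0 if w == word else 0.0 for w in VOCAB]        # input layer (one-hot)
    hidden = [max(0.0, sum(x[i] * W1[i][j] for i in range(4)))  # ReLU
              for j in range(2)]
    scores = [sum(hidden[j] * W2[j][k] for j in range(2))       # output layer
              for k in range(4)]
    return scores

def predict(word):
    scores = forward(word)
    return VOCAB[scores.index(max(scores))]               # highest score wins

print(predict("the"))
```

A real model like GPT-3 follows the same input-to-output flow, but with tens of thousands of vocabulary entries, dozens of transformer layers in place of the single hidden layer, and weights learned from training data.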

With the layers in place, predicting the next word takes three steps.
The first step is to process the input word: it is encoded in the input layer and passed through the hidden layers toward the output layer.
The second step is to calculate a score for each candidate word in the output layer.
The third step is to select the best word: the candidate with the highest score becomes the prediction.
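The three steps above can be sketched end to end: look up the input word, score every candidate next word, and pick the highest-scoring one. The raw scores below come from a hypothetical hand-made table; they are turned into probabilities with a softmax, which is how language models normally convert output-layer scores into a distribution.

```python
# Steps 1-3 end to end, with a softmax turning raw scores into probabilities.
import math

# Hypothetical raw scores (logits) for the word following each input word.
LOGITS = {
    "deep":  {"learning": 2.0, "sea": 0.5, "dish": -1.0},
    "hello": {"world": 1.5, "there": 1.2, "dish": -2.0},
}

def softmax(scores):
    # Step 2: convert raw scores into probabilities that sum to 1.
    exps = {w: math.exp(s) for w, s in scores.items()}
    total = sum(exps.values())
    return {w: e / total for w, e in exps.items()}

def predict_next(word):
    if word not in LOGITS:            # Step 1: check the input word
        return None
    probs = softmax(LOGITS[word])     # Step 2: score each candidate
    return max(probs, key=probs.get)  # Step 3: pick the best word

print(predict_next("deep"))  # learning
```

Softmax does not change which word wins (it preserves the ranking), but the resulting probabilities are what allow sampling strategies other than always taking the top word.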

GPT-2 visualized example

Text above was generated by AI © inite.io

Andrei Ivanouski