The fields of A.I. are brimful of specialised technical jargon. It is no wonder that it is hard for computers to understand us when the research itself is incomprehensible from one field to another. This post translates common AI terms into plain language. The definitions should not be taken too seriously, but they are representative of how the terms are used in practice.
Index A – I
Press ctrl-F to search. Alphabetical order is overrated.
intelligence = what you think it is
real intelligence = denial of previous definition
true intelligence = denial of all definability of intelligence
the AI effect = any feat of intelligence is denied once understood
consciousness = see sentience
sentience = see consciousness
common sense = applied common knowledge
symbol = a word
symbol grounding = connecting words to physical experiences
the symbol grounding problem = words are just letters without meaning
the Turing test = a text-based question-answer game in which AI has to beat humans at sounding human
the Chinese Room argument = an analogy comparing a computer to a postal worker who doesn’t understand Chinese correspondence
the three laws of robotics = conflicting safety instructions for robots from a science fiction plot
the singularity = the robot apocalypse
Moore’s law = the trend that the number of transistors on a chip doubles every two years as transistors get thinner, making computers faster. This is expected to hit the physical limit of 1 atom around 2025.
in 15 years = beyond my ability to predict
in 50 years = when I can no longer be held accountable for my prediction
A.I. on a scale of zero to infinity
Artificial Intelligence (1) = machines that do intelligent things
Artificial Intelligence (2) = Terminators
intelligent systems = AI that does not want to be associated with Terminators
smart systems = automated devices using sensors or internet data, not AI
algorithm = a series of mathematical instructions
narrow AI = AI designed for specific tasks
weak AI = AI with some but not all abilities of a human
strong AI = AI with all abilities of a human
Artificial General Intelligence = AI with all abilities of a human
Artificial Super Intelligence = AI with greater abilities than a human
friendly AI = AI that is programmed not to kill humans despite its superior intelligence
Types of A.I.
symbolic AI = any AI that uses words as units
Good Old-Fashioned AI = AI that processes words through a large number of programmed instructions
rule-based system = AI whose knowledge consists of a checklist of “if A then B” rules
Expert System = AI that forms decisions through a checklist of “if A then B” rules composed by field experts
chatbot = a program accessible through a text chat interface, not necessarily AI or conversational
Big Data = such large amounts of data that it takes AI to make sense of it
neuron = a tiny bit of code that passes a number on to other neurons like a domino brick
Neural Network = AI that maps out patterns with digital domino bricks, then recognises things that follow similar patterns
works like the human brain = uses a neural network, only similar in an abstract way
Genetic Algorithm = randomised trial-and-error simulations, repeated x1000 with whatever worked best so far
fuzzy logic = decimal values
Markov chain = random choice of remaining options
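To make that joke concrete: here is a minimal sketch of a word-level Markov chain that babbles text. The toy corpus and the function names are my own invention, not any standard library:

```python
import random
from collections import defaultdict

def build_chain(words):
    """Map each word to the list of words that ever follow it."""
    chain = defaultdict(list)
    for current, nxt in zip(words, words[1:]):
        chain[current].append(nxt)
    return chain

def babble(chain, start, length=6):
    """Walk the chain: at each step, a random choice of remaining options."""
    word, out = start, [start]
    for _ in range(length - 1):
        options = chain.get(word)
        if not options:
            break
        word = random.choice(options)
        out.append(word)
    return " ".join(out)

corpus = "the cat sat on the mat and the cat slept".split()
chain = build_chain(corpus)
print(babble(chain, "the"))  # e.g. "the cat sat on the mat"
```

Every generated word really did follow the previous one somewhere in the corpus; that is the entire trick.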
machine learning (1) = machines that learn through any means
machine learning (2) = machines that learn through neural networks
deep learning = consecutive layers of neural networks that learn, from crude to refined
supervised learning = telling an AI what stuff is
unsupervised learning = hoping an AI will figure everything out by itself
reinforcement learning = learning through reward/punishment, often through a scoring system
training = feeding a neural network many example texts, images, or sounds to learn their similarities
overfitting = memorising the training examples too precisely
underfitting = generalising the training examples too broadly
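For the curious: “a tiny bit of code that passes a number on” really is about all there is to a single neuron. Below is a sketch of one artificial neuron trained by supervised learning to compute AND. The truth-table data and the variable names are invented for illustration; real networks just do this with millions of neurons at once:

```python
def predict(w1, w2, b, x1, x2):
    """One neuron: weigh the inputs, add a bias, fire if the sum is positive."""
    return 1 if w1 * x1 + w2 * x2 + b > 0 else 0

# Supervised learning: telling the neuron what stuff is (the AND truth table).
examples = [(0, 0, 0), (0, 1, 0), (1, 0, 0), (1, 1, 1)]

w1 = w2 = b = 0.0
for _ in range(20):  # training: repeat the examples until the errors stop
    for x1, x2, target in examples:
        error = target - predict(w1, w2, b, x1, x2)
        w1 += 0.1 * error * x1  # nudge each weight towards the right answer
        w2 += 0.1 * error * x2
        b += 0.1 * error

print([predict(w1, w2, b, x1, x2) for x1, x2, _ in examples])  # [0, 0, 0, 1]
```

Show it four examples often enough and the nudges settle into weights that reproduce AND, which is “learning” in the least mystical sense of the word.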
Language processing techniques
Natural Language Processing = reading text
Natural Language Generation = writing text
corpus = bunch of text
token = a word or punctuation mark
lemma = a root word
word sense = which meaning of a word is meant: “cat” the animal or “cat” the nine-tailed whip
concept = a set of words that are related to a certain topic
bag-of-words = a listing of all the words in a text, used to categorise its topic
stop words = trivial words to be filtered out, like “the”, “on”, “and”, “etc.”
keywords = preprogrammed words that trigger something
intent = a computer command triggered by keywords
pattern matching = searching for a sequence of keywords in a sentence
N-grams = sequences of N adjacent words (pairs, triplets, and so on), used in spellcheckers and speech recognition
word vector = a row of numbers that lists how often a particular word co-occurs with each other word
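The last three entries fit in a handful of lines of code. This is a toy sketch with an invented six-word corpus; real systems do the same counting over millions of words:

```python
from collections import Counter

def bag_of_words(tokens):
    """A listing of all the words in a text, with counts."""
    return Counter(tokens)

def ngrams(tokens, n=2):
    """All runs of n adjacent words."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def word_vector(word, tokens, vocab):
    """Count how often `word` sits right next to each word in the vocabulary."""
    neighbours = Counter()
    for a, b in ngrams(tokens, 2):
        if a == word:
            neighbours[b] += 1
        if b == word:
            neighbours[a] += 1
    return [neighbours[v] for v in vocab]

tokens = "the cat sat on the mat".split()
vocab = sorted(set(tokens))                # ['cat', 'mat', 'on', 'sat', 'the']
print(bag_of_words(tokens)["the"])         # 2
print(ngrams(tokens)[0])                   # ('the', 'cat')
print(word_vector("the", tokens, vocab))   # a row of numbers, one per word
```

A word’s vector is literally just that row of co-occurrence counts; words with similar rows tend to mean similar things.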
Named Entity Recognition = finding names in a text
Context-Free Grammar = textbook grammar only
Part-of-Speech tagging = marking words as verbs, nouns, adjectives, etc.
constituency parser = software that lists a sentence’s syntax: verb phrases, noun phrases, nouns, etc.
dependency parser = software that lists a sentence’s grammar: subject, verb, object, etc.
semantic parser = software that lists who is doing what to whom in a sentence
parse tree = a branching list displaying the syntactic structure of a sentence
coreference resolution = figuring out what “he”, “she” or “it” refers to
speech acts = arbitrary categories of things one can say, like greetings, questions, commands…
discourse analysis = research that arbitrarily categorises small talk
dialogue manager = a system that tracks what was said before and directs a chatbot’s conversation
sentiment analysis = checking whether words are in the “naughty” or “nice” list, to detect opinion or emotion
First Order Logic = writing real-world relations between words as a mathematical notation
semantic ontology = encyclopedia for machines
textual entailment = whether statement A implies statement B
Speech processing techniques
voice recognition = recognising tone and timbre of someone’s voice
speech recognition = translating speech to text
Text-To-Speech = the reverse of speech recognition
phoneme = a vowel or consonant sound
grapheme = a bundle of letters representing a spoken sound
phonetic algorithm = code that spells words the way they are pro-naun-see-ate-d
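That last entry can be made concrete with a simplified sketch in the style of Soundex, a classic phonetic algorithm; this version skips some of the official edge-case rules, so treat it as an illustration rather than the real thing:

```python
# Letters that sound alike share a digit; vowels and soft letters get none.
CODES = {}
for digit, letters in enumerate(["bfpv", "cgjkqsxz", "dt", "l", "mn", "r"], 1):
    for letter in letters:
        CODES[letter] = str(digit)

def soundex(name):
    """Spell a name the way it is pro-naun-see-ate-d: keep the first letter,
    turn the rest into digits, collapse repeats, pad to four characters."""
    name = name.lower()
    prev = CODES.get(name[0], "")
    digits = []
    for letter in name[1:]:
        code = CODES.get(letter, "")
        if code and code != prev:
            digits.append(code)
        prev = code
    return (name[0].upper() + "".join(digits) + "000")[:4]

print(soundex("Robert"), soundex("Rupert"))  # R163 R163  (they sound alike)
```

“Robert” and “Rupert” collapse to the same code, which is exactly what you want when someone misspells a name in a search box.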
To be continued.
3 thoughts on “The A.I. dictionary”
Nice post! I’d really like to see this dictionary growing with time. It might become useful when I’m trying to phrase simple explanations of complex technical things.
Just a minor comment, regarding “grammar parser”, I don’t know if it’s a copy-paste mistake, but grammar parser and part of speech tagger are different. Parsers usually follow the POS tagger in the pipeline, and return the syntactic structure of a sentence. There are two types of parsers: the first segments the text to noun phrases, verb phrases, etc., and the second returns the dependencies between words in the sentence (e.g. subject and object, noun and modifier).
It was our earlier exchange that inspired me to jot this down, among other things.
Since grammar can’t do without POS, I rarely see that component mentioned separately, even if technically it is one. It’s usually included in the package of a grammar parser. Like a car isn’t its wheels, but it does roll. I might add that a grammar parser marks subjects and objects as well, but I’m still considering the wisdom of describing a parse tree.
I’m sure the dictionary will grow and adjust as I learn more. I added “concept” yesterday.
I’m glad to hear 🙂
I think the main thing about a (constituency) parser is that it groups words that together make a certain syntactic category in the sentence (e.g. determiner and noun make a noun phrase). The tree thing just means that it’s recursive, but that’s probably more than needed for this dictionary.