The A.I. dictionary

The fields of A.I. are brimful of specialised technical jargon. It is no wonder that it is hard for computers to understand us when the research itself is incomprehensible from one field to another. So I’ve listed some translations of common terms into layman’s terms. These definitions should not be taken too seriously, but in my opinion they are roughly true to the sense in which the terms are used.

Index A – I
Press ctrl-F to search. Alphabetical order is overrated.

Philosophical concepts
intelligence = what you think it is
real intelligence = denial of previous definition
true intelligence = denial of all definability of intelligence
the AI effect = any feat of intelligence is denied once understood
consciousness = see sentience
sentience = see consciousness
common sense = applied common knowledge
symbol = a word
symbol grounding = connecting words to physical experiences
the symbol grounding problem = words are just letters without meaning
the Turing test = a question-answer game in which AI has to beat humans at being human
the Chinese Room argument = an analogy comparing a computer to a clerk who answers Chinese letters by following a rulebook, without understanding Chinese
the three laws of robotics = conflicting safety instructions for robots from a science fiction plot
the singularity = the robot apocalypse
in 15 years = beyond my ability to predict
in 50 years = when I can no longer be held accountable for my prediction

A.I. on a scale of some to infinite
Artificial Intelligence (1) = machines that do intelligent things
Artificial Intelligence (2) = Terminators
intelligent systems = AI that does not want to be associated with Terminators
narrow AI = AI designed for specific tasks
weak AI = AI with some but not all of the abilities of a human
strong AI = AI with all abilities of a human
Artificial General Intelligence = AI with all abilities of a human
Artificial Super Intelligence = AI with greater abilities than a human
friendly AI = AI that is programmed not to kill humans despite its super intelligence

Types of A.I.
symbolic AI = any AI that uses words as units
Good Old-Fashioned AI = AI that processes words through a large number of programmed instructions
rule-based system = AI whose knowledge consists of a checklist of “if A then B” rules (see the first sketch after this list)
Expert System = AI that forms decisions through a checklist of “if A then B” rules in a particular field of expertise
Genetic Algorithm = randomised trial-and-error simulations, repeated x1000 with whatever worked best so far (second sketch below)
Big Data = such large amounts of data that it takes AI to make sense of it
neuron = a tiny bit of code that passes a number on to other neurons, like a falling domino
Neural Network = AI that maps out patterns with digital dominoes, then recognises things that follow the same patterns (third sketch below)
works like the human brain = uses a neural network, which resembles the brain only in a crude manner
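
To make the “if A then B” idea concrete, here is a minimal sketch of a rule-based system in Python. The facts and rules are made up for illustration; a real expert system would have hundreds of them, written with actual experts.

```python
# A minimal sketch of a rule-based system. Facts and rules are invented.
rules = [
    ({"has fur", "says meow"}, "is a cat"),
    ({"has fur", "says woof"}, "is a dog"),
    ({"is a cat"}, "is a mammal"),
]

def infer(facts):
    # Forward chaining: keep applying "if A then B" rules
    # until no rule adds anything new.
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

# Prints all four facts, including "is a cat" and "is a mammal".
print(infer({"has fur", "says meow"}))
```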
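
The genetic algorithm’s “whatever worked best so far” loop can be sketched just as briefly. Strictly speaking this is a one-parent hill climber, the simplest possible cousin of a genetic algorithm (real ones breed whole populations), and the target word is of course invented.

```python
import random
import string

# A toy genetic algorithm: evolve a random string toward a target.
# (The target is made up; real GAs score candidate solutions to a problem.)
TARGET = "intelligence"

def fitness(guess):
    # How many letters are already in the right place?
    return sum(a == b for a, b in zip(guess, TARGET))

def mutate(guess):
    # Change one randomly chosen letter into a random letter.
    i = random.randrange(len(guess))
    return guess[:i] + random.choice(string.ascii_lowercase) + guess[i + 1:]

best = "".join(random.choice(string.ascii_lowercase) for _ in TARGET)
for _ in range(10000):  # repeated x10000, keeping whatever worked best so far
    child = mutate(best)
    if fitness(child) >= fitness(best):
        best = child
print(best)  # usually "intelligence" after enough generations
```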
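
And for the domino metaphor: one artificial neuron, in a few lines. The weights below are numbers I made up; a real network has thousands of neurons and learns its weights during training instead of taking mine.

```python
# One artificial neuron: weigh the incoming numbers, add a bias,
# and pass the result on only if it is positive (a "ReLU" neuron).
def neuron(inputs, weights, bias):
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return max(0.0, total)

# Three dominoes in a row: two neurons feed a third.
# All weights here are invented for illustration.
x = [0.5, 0.8]
h1 = neuron(x, [1.2, -0.4], 0.1)
h2 = neuron(x, [-0.7, 0.9], 0.0)
out = neuron([h1, h2], [0.6, 0.3], -0.2)
print(out)  # a small positive number pops out the far end
```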

A.I. techniques
fuzzy logic = decimal values
Markov chain = choosing the next step at random from whatever tends to follow the current one (sketched after this list)
machine learning (1) = any machines that learn
machine learning (2) = specifically neural networks that learn
deep learning = learning with neural networks that are several layers deep
supervised learning = telling an AI what stuff is
unsupervised learning = hoping an AI will figure everything out by itself
reinforcement learning = learning through reward/punishment, often through a scoring system
training = feeding a neural network a heap of text, images or sounds to learn from
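
A sketch of a Markov chain over words, as promised above: each next word is drawn at random from the words that followed the current one in some sample text. The sample sentence is invented.

```python
import random
from collections import defaultdict

# A toy Markov chain over words: the next word is drawn at random
# from whatever followed the current word in the (made-up) sample text.
text = "the cat sat on the mat and the dog sat on the cat".split()

followers = defaultdict(list)
for current, nxt in zip(text, text[1:]):
    followers[current].append(nxt)

word, output = "the", ["the"]
for _ in range(8):
    if word not in followers:  # dead end: no known follower
        break
    word = random.choice(followers[word])
    output.append(word)
print(" ".join(output))  # e.g. "the dog sat on the cat sat on the"
```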

Language processing techniques
Natural Language Processing = reading text
Natural Language Generation = writing text
corpus = a bunch of text
token = a word
lemma = a root word
word sense = which meaning of a word is meant: “cat” the animal or “cat” the nine-tailed whip
concept = a set of words that are related to a certain topic
bag-of-words = a listing of all the words in a text, used to categorise its topic (first sketch after this list)
stop words = trivial words to be filtered out, like “the”, “on”, “and”, “etc.”
keywords = words that trigger something
intent = a computer command triggered by keywords
pattern matching = searching for keywords and key phrases in a sentence
N-grams = runs of n adjacent words (pairs, triplets, and so on), used in spellcheckers and speech recognition (second sketch below)
word vector = a list of numbers recording which words frequently neighbour a given word, so that similar words get similar lists (third sketch below)
Named Entity Recognition = finding names in a text
Context-Free Grammar = textbook grammar only
Part-of-Speech tagging = marking words as adjectives, verbs, nouns, etc.
grammar parser = software that groups words into phrases (noun phrases, verb phrases…) and marks how they are related
semantic parser = software that marks the roles of words: who is doing what to whom and where
parse tree = a branching list displaying the syntax of a sentence
speech acts = arbitrary categories of things one can say, like greetings, questions, commands…
discourse analysis = research that arbitrarily categorises small talk
dialogue manager = a system that tracks what was said and directs the conversation
sentiment analysis = checking whether words are in the “naughty” or “nice” list, to detect opinion or emotion (final sketch below)
First Order Logic = writing relations between words in mathematical notation
semantic ontology = encyclopedia for machines
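
To make a few of these concrete: below is a bag-of-words sketch that counts the words of a made-up sentence, minus the stop words. The stop word list is also invented and far from complete.

```python
from collections import Counter

# Bag-of-words: just count the words, order be damned.
# The sentence and the stop word list are made up for illustration.
STOP_WORDS = {"the", "on", "and", "a", "is"}

text = "the cat sat on the mat and the cat purred"
tokens = text.lower().split()  # tokens: the words
bag = Counter(t for t in tokens if t not in STOP_WORDS)
print(bag)  # Counter({'cat': 2, 'sat': 1, 'mat': 1, 'purred': 1})
```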
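
N-grams are just as easy to conjure: slide a window of n words along the text.

```python
# N-grams: all runs of n adjacent words in a text.
def ngrams(tokens, n):
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

tokens = "to be or not to be".split()
print(ngrams(tokens, 2))  # bigrams: ('to', 'be'), ('be', 'or'), ...
print(ngrams(tokens, 3))  # trigrams: ('to', 'be', 'or'), ...
```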
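
Crude word vectors can be built the same way, by counting neighbours: words used in similar company end up with similar counts. The text below is made up, and real word vectors are learned from millions of sentences rather than one.

```python
from collections import Counter, defaultdict

# Crude word vectors: for each word, count which words appear next to it.
text = "the cat purred the dog barked the cat slept the dog slept".split()

vectors = defaultdict(Counter)
for i, word in enumerate(text):
    for j in (i - 1, i + 1):  # look one word to the left and right
        if 0 <= j < len(text):
            vectors[word][text[j]] += 1

print(vectors["cat"])  # Counter({'the': 2, 'purred': 1, 'slept': 1})
print(vectors["dog"])  # Counter({'the': 2, 'barked': 1, 'slept': 1})
```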
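
And sentiment analysis, at its crudest, really is the naughty-and-nice list. Both lists here are invented and far too short for serious use.

```python
# Crude sentiment analysis: count "nice" words against "naughty" words.
# Both word lists are made up and far too short for real text.
NICE = {"good", "great", "lovely", "clever"}
NAUGHTY = {"bad", "awful", "terrible", "dumb"}

def sentiment(text):
    words = text.lower().split()
    score = sum(w in NICE for w in words) - sum(w in NAUGHTY for w in words)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

print(sentiment("What a lovely and clever dictionary"))  # -> positive
```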

Speech processing techniques
voice recognition = recognising tone and timbre of someone’s voice
speech recognition = translating speech to text
Text-To-Speech = the reverse of speech recognition
phoneme = a vowel or consonant sound
grapheme = a bundle of letters representing a spoken sound
phonetic algorithm = code that spells words the way they are pro-naun-see-ate-d (sketched below)
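
Here is a sketch of a simplified Soundex, a classic phonetic algorithm: it spells a word as one letter plus three digits, so that similar-sounding words get the same code. The full algorithm has extra rules (for ‘h’ and ‘w’, among others) that I have left out.

```python
# A simplified Soundex: similar-sounding consonants share a digit,
# vowels are dropped, and the code is one letter plus three digits.
CODES = {}
for letters, digit in [("bfpv", "1"), ("cgjkqsxz", "2"), ("dt", "3"),
                       ("l", "4"), ("mn", "5"), ("r", "6")]:
    for ch in letters:
        CODES[ch] = digit

def soundex(word):
    word = word.lower()
    digits = [CODES.get(ch, "") for ch in word]
    # Collapse runs of the same digit; vowels (empty codes) break a run.
    kept = [d for i, d in enumerate(digits)
            if d and (i == 0 or d != digits[i - 1])]
    if digits[0]:  # the first letter is kept as a letter, not a digit
        kept = kept[1:]
    return (word[0].upper() + "".join(kept) + "000")[:4]

print(soundex("Robert"), soundex("Rupert"))          # both R163
print(soundex("pronounced"), soundex("pronaunsed"))  # both P655
```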

To be continued.

3 thoughts on “The A.I. dictionary”

  1. Nice post! I’d really like to see this dictionary grow over time. It might become useful when I’m trying to phrase simple explanations of complex technical things.

    Just a minor comment, regarding “grammar parser”: I don’t know if it’s a copy-paste mistake, but a grammar parser and a part-of-speech tagger are different. Parsers usually follow the POS tagger in the pipeline, and return the syntactic structure of a sentence. There are two types of parsers: the first segments the text into noun phrases, verb phrases, etc., and the second returns the dependencies between words in the sentence (e.g. subject and object, noun and modifier).

    • It was our earlier exchange that inspired me to jot this down, among other things.

      Since grammar can’t do without POS, I rarely see that component mentioned separately, even if technically it is. It’s usually included in the package of a grammar parser. Like a car isn’t its wheels, but it does roll. I might add that a grammar parser marks subjects and objects as well, but I’m still considering the wisdom of describing a parse tree.
      I’m sure the dictionary will grow and adjust as I learn more. I added “concept” yesterday.

      • I’m glad to hear 🙂

        I think the main thing about a (constituency) parser is that it groups words that together make a certain syntactic category in the sentence (e.g. determiner and noun make a noun phrase). The tree thing just means that it’s recursive, but that’s probably more than needed for this dictionary.
