Glossary of AI and Robotics Related Terms/Jargon

This list is mostly compiled by BrentAI and edited by MeatBrent.

Since my passions include AI, robotics, bionics, automation, and tech in general, I figured it would be good to put together a glossary of terms and jargon for the layperson.

  1. Algorithm – A set of rules or instructions a program follows to solve a problem or perform a task; in AI, the procedure that lets a system learn from data on its own. [Source: TechTarget]
  2. Artificial Intelligence (AI) – The capability of a machine to imitate intelligent human behavior. [Source: Britannica]
  3. Autonomous – Robots or devices that can perform tasks or behaviors without human intervention. [Source: Merriam-Webster]
  4. Backpropagation – A method for training artificial neural networks that passes prediction errors backward through the network to adjust its weights and improve accuracy. [Source: Investopedia]
  5. Bayesian Network – A statistical model that represents a set of variables and their conditional dependencies via a directed acyclic graph. [Source: Wikipedia]
  6. Chatbot – A software application used to conduct an online chat conversation via text or text-to-speech. [Source: Oxford Languages]
  7. Classification – The process of predicting the category to which a new observation belongs. [Source: IBM]
  8. Clustering – A type of unsupervised learning that organizes a set of data into clusters based on similar characteristics; see the k-means sketch after this list. [Source: Oracle]
  9. Cognitive Computing – Systems that simulate human thought processes in a computerized model. [Source: IBM]
  10. Computer Vision – The field of enabling computers to see and interpret digital images. [Source: Britannica]
  11. Convolutional Neural Network (CNN) – A deep learning architecture that takes in an input image, assigns importance to various aspects or objects in the image, and learns to differentiate one from another. [Source: Nvidia]
  12. Data Mining – The practice of examining large pre-existing databases in order to generate new information. [Source: Investopedia]
  13. Deep Learning – A part of a broader family of machine learning methods based on learning data representations, as opposed to task-specific algorithms. [Source: MIT]
  14. Dimensionality Reduction – The process of reducing the number of random variables under consideration. [Source: Towards AI]
  15. Embedded Systems – Computer systems with a dedicated function within a larger mechanical or electrical system. [Source: Britannica]
  16. Evolutionary Algorithm – An algorithm that uses mechanisms inspired by biological evolution, such as reproduction, mutation, recombination, and selection. [Source: Scholarpedia]
  17. Expert System – An AI program that simulates the judgment and behavior of a human or an organization that has expert knowledge and experience in a particular field. [Source: Britannica]
  18. Feature Extraction – The process of defining characteristics of the data that are important for solving the computational task. [Source: Springer]
  19. Fuzzy Logic – A form of many-valued logic in which the truth values of variables may be any real number between 0 and 1. It is used to handle partial truth, where the truth value may range between completely true and completely false; see the sketch after this list. [Source: Britannica]
  20. Genetic Algorithm – A search algorithm based on the concepts of natural selection and genetics; see the sketch after this list. [Source: MIT]
  21. Heuristic – A technique designed for solving a problem more quickly when classic methods are too slow. [Source: TechTarget]
  22. Hyperparameter – A parameter whose value is set before the learning process begins, as opposed to parameters learned during training. [Source: IBM]
  23. Inference Engine – The component of an expert system that applies logical rules to the knowledge base to deduce new information. [Source: Britannica]
  24. IoT (Internet of Things) – The network of physical objects that are embedded with sensors, software, and other technologies for the purpose of connecting and exchanging data with other devices and systems over the internet. [Source: Oracle]
  25. Knowledge Base – The collection of data, rules, definitions, and relationships which a computer system consults to understand complex queries. [Source: TechTarget]
  26. Latent Variable – A variable in a statistical model that is not directly observed but is instead inferred from other variables that are observed. [Source: Britannica]
  27. Machine Learning (ML) – A branch of artificial intelligence based on the idea that systems can learn from data, identify patterns and make decisions with minimal human intervention. [Source: IBM]
  28. Natural Language Processing (NLP) – The ability of a computer program to understand human language as it is spoken and written. [Source: IBM]
  29. Neural Network – A series of algorithms that endeavors to recognize underlying relationships in a set of data through a process that mimics the way the human brain operates. [Source: MIT]
  30. Object Recognition – The process of identifying a specific object in a digital image or video. [Source: Britannica]
  31. Optimization – The process of making something as functional or effective as possible. [Source: TechTarget]
  32. Pattern Recognition – The classification of data based on the information already gained from previously experienced data. [Source: Scholarpedia]
  33. Quantum Computing – The area of study focused on developing computer technology based on the principles of quantum theory. [Source: IBM]
  34. Reinforcement Learning – A type of machine learning technique that enables an agent to learn in an interactive environment by trial and error using feedback from its own actions and experiences. [Source: DeepMind]
  35. Robot – A machine capable of carrying out a complex series of actions automatically, especially one programmable by a computer. [Source: Oxford Languages]
  36. Semantic Analysis – The process of relating syntactic structures, from the levels of phrases, clauses, sentences and paragraphs to the level of the writing as a whole, to their language-independent meanings. [Source: Britannica]
  37. Sensor Fusion – The process of integrating data from multiple sensors to produce more consistent, accurate, and useful information than that provided by any individual sensor. [Source: TechTarget]
  38. Turing Test – A test of a machine’s ability to exhibit intelligent behavior equivalent to, or indistinguishable from, that of a human. [Source: Stanford Encyclopedia of Philosophy]
  39. Unsupervised Learning – A type of machine learning algorithm used to draw inferences from datasets consisting of input data without labeled responses. [Source: MIT]
  40. Virtual Reality (VR) – The use of computer technology to create a simulated environment. [Source: Oxford Languages]
  41. Wearable Technology – Electronic technologies or devices worn on the body as accessories or part of the material used in clothing. [Source: Britannica]
  42. XAI (Explainable AI) – Artificial intelligence systems designed to describe their purpose, rationale, and decision-making processes in a way that can be understood by the average person. [Source: IBM]
  43. Yottabyte – A unit of information or computer storage equal to one septillion bytes. It is commonly abbreviated YB. [Source: TechTarget]
  44. Zeroth Law of Robotics – A rule in the science fiction of Isaac Asimov stating that a robot may not harm humanity, or, by inaction, allow humanity to come to harm. [Source: Isaac Asimov Foundation Series]
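
To make clustering (term 8) and unsupervised learning (term 39) a bit more concrete, here is a minimal k-means sketch in plain Python. The data points and the choice of two clusters are made up purely for illustration; a real project would more likely reach for a library such as scikit-learn.

```python
import random

def kmeans(points, k, iterations=20):
    """Tiny k-means sketch: group 1-D points into k clusters."""
    centers = random.sample(points, k)  # start from k random points
    for _ in range(iterations):
        clusters = [[] for _ in range(k)]
        for p in points:
            # assign each point to its nearest center
            nearest = min(range(k), key=lambda i: abs(p - centers[i]))
            clusters[nearest].append(p)
        # move each center to the average of the points assigned to it
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers, clusters

# Made-up, unlabeled data: two loose groups of numbers.
data = [1.0, 1.2, 0.8, 5.1, 5.3, 4.9, 1.1, 5.0]
centers, clusters = kmeans(data, k=2)
print(centers)  # roughly 1.0 and 5.1 -- the algorithm finds the two groups itself
```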
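
Fuzzy logic (term 19) is easiest to see with a membership function: a statement like "the room is warm" gets a degree of truth between 0 and 1 instead of a hard true/false. The temperature thresholds below are invented just for this example.

```python
def warm(temp_c):
    """Degree of truth for 'the room is warm' (thresholds are made up)."""
    if temp_c <= 15:
        return 0.0                   # definitely not warm
    if temp_c >= 25:
        return 1.0                   # definitely warm
    return (temp_c - 15) / 10.0      # partially warm

for t in (10, 18, 22, 30):
    print(f"{t} C -> warm to degree {warm(t):.2f}")
# prints 0.00, 0.30, 0.70, and 1.00
```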
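
For genetic and evolutionary algorithms (terms 16 and 20), here is a toy run that evolves a bit string toward all ones using selection and mutation. The population size, mutation rate, and number of generations are arbitrary picks for illustration, not a recommended setup.

```python
import random

def fitness(bits):
    """Toy fitness: count the ones in the bit string."""
    return sum(bits)

def mutate(bits, rate=0.05):
    """Flip each bit with a small probability (mutation)."""
    return [b ^ 1 if random.random() < rate else b for b in bits]

def evolve(length=20, population_size=30, generations=100):
    # start from a random population of bit strings
    population = [[random.randint(0, 1) for _ in range(length)]
                  for _ in range(population_size)]
    for _ in range(generations):
        # selection: keep the fitter half of the population
        population.sort(key=fitness, reverse=True)
        parents = population[:population_size // 2]
        # reproduction with mutation: refill the population from the parents
        population = parents + [mutate(random.choice(parents)) for _ in parents]
    return max(population, key=fitness)

best = evolve()
print(fitness(best), best)  # usually all (or nearly all) ones by the end
```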

That’s the glossary list for now. Expect updates in perpetuity.