Artificial Intelligence - An Extremely Brief Overview

According to Dictionary.com (2018), ‘Artificial Intelligence’ is “the capacity of a computer to perform operations analogous to learning and decision making in humans, as by an expert system, a program for CAD or CAM, or a program for the perception and recognition of shapes in computer vision systems.” In short, it is software that can learn in one way or another.

Artificial Intelligence (AI) has evolved quickly, becoming markedly more advanced in recent years. It began with rule-based AI, in which the machine is given a set of rules to follow that can be refined and added to; the problem is that the world is a complicated place. Next came learning AI, where the AI took these rules, decided on the best outcomes, and refined them itself. Now we are moving to neural networks, which are almost like a ‘black box’. They work similarly to a brain: they have neurons and process data based on the connections between those neurons. This, however, makes it impossible to inspect the process they are using and refine it manually. (Barnatt, 2017)
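The contrast above can be sketched in a few lines of Python. This is an illustrative toy only, not any particular system: a hand-written rule next to a tiny perceptron (the simplest artificial neuron) that learns the same behaviour from labelled examples. All names and data here are hypothetical.

```python
def rule_based_and(a, b):
    """Rule-based AI: the behaviour is written out by hand as explicit rules."""
    if a == 1 and b == 1:
        return 1
    return 0

def train_perceptron(samples, epochs=20, lr=0.1):
    """Learning AI: start with no rules and adjust weights from examples."""
    w0, w1, bias = 0.0, 0.0, 0.0
    for _ in range(epochs):
        for (a, b), target in samples:
            pred = 1 if (w0 * a + w1 * b + bias) > 0 else 0
            err = target - pred        # zero when the prediction is already correct
            w0 += lr * err * a         # nudge each weight toward the target
            w1 += lr * err * b
            bias += lr * err
    return w0, w1, bias

# Labelled examples of logical AND -- the data the machine learns from.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w0, w1, bias = train_perceptron(data)
learned_and = lambda a, b: 1 if (w0 * a + w1 * b + bias) > 0 else 0
```

The learned model ends up matching the hand-written rule, but its "knowledge" is spread across numeric weights rather than readable rules, which is the black-box problem in miniature.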

In most cases AI can be split into three parts: the learning algorithm, test data to learn from, and some form of hardware. AI is often run on a normal computer's processor, which is what we are using; however, specialized hardware also exists. Test data can either be given to train the AI before it is used, and/or the AI can learn while it is in use from the data it receives and the outcomes. We are using the latter. (Gerven & Bohte, 2018)
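The two training modes described above can be sketched as follows. This is a minimal, hypothetical example (the learner is just a running average that predicts the next value in a stream), chosen to show the difference between "train first, then use" and "learn while in use".

```python
class OnlineMean:
    """Learns while in use: every outcome it observes refines the model."""

    def __init__(self):
        self.count = 0
        self.estimate = 0.0

    def predict(self):
        return self.estimate

    def observe(self, outcome):
        # Incremental update -- no need to store or revisit past data.
        self.count += 1
        self.estimate += (outcome - self.estimate) / self.count

# Batch mode: learn from a fixed training set, then freeze the model and use it.
training_data = [4.0, 6.0, 5.0]
batch_model = sum(training_data) / len(training_data)

# Online mode: the model starts empty and improves as each outcome arrives,
# which is the approach we are using.
online_model = OnlineMean()
for outcome in [4.0, 6.0, 5.0]:
    online_model.observe(outcome)
```

Both end up with the same estimate here, but only the online learner can keep adapting after deployment, at the cost of being wrong more often early on.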

Further Reading:

Types of AI: 
https://simplicable.com/new/types-of-artificial-intelligence
https://futurism.com/images/types-of-ai-from-reactive-to-self-aware-infographic/

AI Hardware:
https://cloud.google.com/tpu/
https://azure.microsoft.com/en-us/services/machine-learning-studio/

AI Components:
http://www.explainingcomputers.com/ai.html


References:

Dictionary.com. (2018, May 06). artificial intelligence. Retrieved from Dictionary: http://www.dictionary.com/browse/automation

Barnatt, C. (2017). DIGITAL GENESIS - The Future of Computing, Robots and AI. explainingcomputers.com.

Gerven, M. V., & Bohte, S. (2018). Artificial Neural Networks as Models of Neural Information Processing. Frontiers in Computational Neuroscience, 5-30.