This post gathers the tutorials and exercises, called “mega-recitations”, that apply the concepts presented in Prof. Patrick Winston’s Artificial Intelligence course.
Bayesian Story Merging
Using the probability-model discovery technique studied previously, similar concepts and ideas can be analyzed and merged.
By analyzing the correspondences between clusters in two sets of data, regular correspondences between data subsets can be identified. According to Prof. Patrick Winston, this system of correspondences is very likely present in human intelligence.
Applications of AI
- Scientific approach: understanding how AI works
- Engineering approach: implementing AI applications
The most interesting applications are not those that replace people with artificial intelligence, but those where it works in tandem with people.
Using large amounts of computing power and data is becoming more common, but an interesting question is how little information is actually needed to solve a given problem.
The system translates stories into an internal language in order to understand them and display them as diagrams. It can read stories at different levels, and use different “personas” to understand the same story differently.
Humans may not be intelligent enough to build a machine that is as intelligent as them.
Continued from previous class
Event diagrams must always be arranged so that there are final nodes and no loops. Probabilities are recorded in a table for each event; the tables are filled through repeated experience, so that the probability and frequency of occurrence of each event become known.
Several models can be drawn for a given set of events. To decide which model is right, the Bayesian probability formulas can be used to confirm whether events are independent, make the computations easier, and choose the more appropriate model.
- P(a|b) = P(a,b) / P(b)
- P(a|b) P(b) = P(a,b) = P(b|a) P(a)
- P(a|b) = P(b|a) P(a) / P(b)
Defining a as a class and b as the evidence, the probability of the class given the evidence can be obtained through these formulas:
P(class|evidence) = P(evidence|class) P(class) / P(evidence)
Using the evidence gathered from experience, classes can be inferred by analyzing the results and the corresponding probabilities.
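As a minimal sketch of this inference, here is Bayes’ rule applied to tallies recorded from experience; the classes, evidence, and counts below are invented for illustration:

```python
# Invented tallies of joint occurrences: counts[(class, evidence)].
counts = {
    ("flu", "fever"): 40, ("flu", "no_fever"): 10,
    ("cold", "fever"): 20, ("cold", "no_fever"): 30,
}
total = sum(counts.values())

def p(cls=None, ev=None):
    """Marginal or joint probability estimated from the tallies."""
    return sum(n for (c, e), n in counts.items()
               if (cls is None or c == cls) and (ev is None or e == ev)) / total

def p_class_given_evidence(cls, ev):
    # P(class|evidence) = P(evidence|class) P(class) / P(evidence)
    return (p(cls=cls, ev=ev) / p(cls=cls)) * p(cls=cls) / p(ev=ev)

print(p_class_given_evidence("flu", "fever"))  # 40/60 ≈ 0.667
```

The same value can be read directly off the joint tally as P(class,evidence)/P(evidence); Bayes’ rule matters when only the per-class likelihoods are available.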
Given data from experience or simulation, the right model can be selected as the one that better matches the observed probabilities. This makes it possible to choose between two existing models.
However, when many models can be created, the sheer number of candidate models makes it impossible to compare them all. The solution is to keep two models and compare them repeatedly: at each trial, the losing model is modified for improvement, until a model meets certain criteria for success.
A trick is to use the sum of the logarithms of the probabilities rather than their product, as a large number of trials makes the product too small to compute properly.
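A short sketch shows why the trick is needed; the per-trial probability is an invented value:

```python
import math

# Multiplying many small probabilities underflows double precision to 0.0,
# while summing their logarithms stays perfectly usable for comparison.
probs = [1e-5] * 100  # 100 trials, each with (invented) probability 1e-5

product = 1.0
for p in probs:
    product *= p
print(product)   # underflows to exactly 0.0

log_sum = sum(math.log(p) for p in probs)
print(log_sum)   # ≈ -1151.3, still easy to compare between two models
```

Two models can then be compared by their log-probability sums: the larger (less negative) sum wins the trial.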
To avoid local maxima, a radical rearrangement of structure is launched after a certain number of trials.
This Bayesian structure discovery works quite well in situations where a diagnosis must be made: medical diagnosis, lie detection, diagnosing the symptoms of a malfunctioning aircraft or program…
Probabilities in Artificial Intelligence
With a joint probability table, recording the tally of joint event occurrences allows us to measure the probability of each event, conditional and unconditional probabilities, the independence of events, etc.
The problem with such a table is that as the number of variables increases, the number of rows grows exponentially.
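A small sketch makes both points concrete: the table’s exponential growth, and how any probability can be read off a joint table. The two-variable table below is invented for illustration:

```python
# A full joint table over n binary variables has 2**n rows.
for n in (2, 10, 30):
    print(n, 2 ** n)  # 4, 1024, and over a billion rows

# From a small joint table, every quantity of interest can be derived.
joint = {  # invented P(rain, sprinkler)
    (True, True): 0.05, (True, False): 0.25,
    (False, True): 0.20, (False, False): 0.50,
}
p_rain = sum(p for (r, s), p in joint.items() if r)          # marginal
p_sprinkler = sum(p for (r, s), p in joint.items() if s)
p_rain_given_sprinkler = joint[(True, True)] / p_sprinkler   # conditional
print(p_rain, p_rain_given_sprinkler)
```

The derivations are trivial here; the exponential row count is what makes this approach collapse as variables are added.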
Reminders of probability formulas
Basic axioms of probability
- 0 ≤ P(a) ≤ 1
- P(True) = 1 ; P(False) = 0
- P(a ∨ b) = P(a) + P(b) – P(a,b)
Basic definitions of probability
- P(a|b) = P(a,b) / P(b)
- P(a,b) = P(a|b) P(b)
- P(a,b,c) = P(a|b,c) P(b,c) = P(a|b,c) P(b|c) P(c)
Chain rule of probability
By generalizing the previous formula, we obtain the following chain rule:
P(x1, …, xn) = P(x1) P(x2|x1) P(x3|x1,x2) … P(xn|x1, …, xn−1)
P(a|b) = P(a) if a and b are independent
If a and b are conditionally independent given z:
- P(a|b,z) = P(a|z)
- P(a,b|z) = P(a|z) P(b|z)
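These conditional-independence identities can be checked numerically. The sketch below builds an invented joint distribution where a and b are independent given z by construction, then verifies that P(a|b,z) equals P(a|z):

```python
# Invented conditional probabilities; the joint is built as
# P(a,b,z) = P(a|z) P(b|z) P(z), so a and b are independent given z.
p_z = {True: 0.4, False: 0.6}
p_a_given_z = {True: 0.9, False: 0.2}  # P(a=True | z)
p_b_given_z = {True: 0.7, False: 0.1}  # P(b=True | z)

def joint(a, b, z):
    pa = p_a_given_z[z] if a else 1 - p_a_given_z[z]
    pb = p_b_given_z[z] if b else 1 - p_b_given_z[z]
    return pa * pb * p_z[z]

# P(a | b, z) computed from the joint should match P(a | z) for every z.
for z in (True, False):
    p_a_given_bz = joint(True, True, z) / (
        joint(True, True, z) + joint(False, True, z))
    print(round(p_a_given_bz, 6), p_a_given_z[z])  # the two columns match
```

Running the same check on a distribution without that structure would show the two columns diverge, which is exactly how independence assumptions are tested against data.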
Causal relations between events can be represented in nets. These models capture the fact that each event depends directly only on its parents: given its parents, a node is conditionally independent of its non-descendants. By recording a probability table at each node, the number of tables and rows is significantly smaller than a single table tallying all events.
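A back-of-the-envelope comparison illustrates the saving, assuming for simplicity a chain-shaped net over binary variables:

```python
# With n binary variables, a full joint table needs 2**n rows.
# In a chain-shaped net x1 -> x2 -> ... -> xn, the root needs a 2-row
# table and every other node a 4-row table conditioned on its parent.
n = 20
full_joint_rows = 2 ** n               # 1,048,576 rows
chain_net_rows = 2 + (n - 1) * 4       # 78 rows in total
print(full_joint_rows, chain_net_rows)
```

Nets with more parents per node sit between these extremes: each node’s table grows with the number of its parents, not with the total number of variables.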
General Problem Solver
By analyzing the difference between the current state and the desired state, a set of intermediary steps can be created to solve the problem: this is the problem-solving hypothesis.
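A toy sketch of this difference-reduction idea, not GPS itself; the states and operators below are invented for illustration:

```python
# States and goals are sets of facts; each operator adds some facts.
def solve(state, goal, operators, max_steps=10):
    plan = []
    for _ in range(max_steps):
        diff = goal - state           # the difference still to reduce
        if not diff:
            return plan               # goal reached
        # pick any operator whose effect reduces the difference
        name, effect = next((n, e) for n, e in operators.items()
                            if e & diff)
        state = state | effect
        plan.append(name)
    return plan

operators = {"buy_flour": {"flour"}, "buy_eggs": {"eggs"},
             "bake": {"cake"}}
print(solve({"oven"}, {"oven", "flour", "eggs"}, operators))
```

Real means-ends analysis also handles operator preconditions by recursively setting them up as subgoals; this sketch only shows the compare-and-reduce loop.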
SOAR (State Operator And Result)
- Long-term memory
- Short-term memory
- Vision system
- Action system
Key parts of the SOAR architecture:
- Long-term memory and short-term memory
- Assertions and rules (production)
- Preferences systems between the rules
- Problem spaces (make a space and search through that space)
- Universal sub-goaling: new problems that emerge during resolution become entirely new goals, with their own assertions, rules, etc.
SOAR relies on the symbol system hypothesis. It primarily deals with deliberative thinking.
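A toy sketch of the assertion/rule cycle underlying such architectures, not SOAR’s actual machinery; the assertions, rules, and priorities are invented for illustration:

```python
# Short-term memory holds assertions; long-term memory holds if-then
# rules; a preference (here, a priority number) picks among the rules
# whose conditions all hold.
short_term = {"thirsty", "has_glass"}
rules = [  # (priority, conditions, assertion to add)
    (1, {"thirsty"}, "want_water"),
    (2, {"want_water", "has_glass"}, "fill_glass"),
    (3, {"fill_glass"}, "drink"),
]

changed = True
while changed:
    changed = False
    # fire the most-preferred rule that matches and adds something new
    for _, conds, result in sorted(rules):
        if conds <= short_term and result not in short_term:
            short_term.add(result)
            changed = True
            break

print(sorted(short_term))
```

Each pass through the loop is one decision cycle: match rules against short-term memory, prefer one, apply it, repeat until nothing new fires.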
Emotion Machine
Created by Marvin Minsky to tackle more complex problems, this architecture involves thinking at several layers:
- Reflective thinking
- Deliberative thinking
- Learned reaction
- Instinctive reaction
It is based upon the common sense hypothesis.
Subsumption Architecture
System created by Rodney Brooks. By layering levels of abstraction in the building of robots (such as for robot vision and movement), modifications to one layer do not interfere with the computations of the other layers, allowing better incremental improvement of the system as a whole.
It primarily deals with instinctive reaction and learned reaction.
This is the creature hypothesis: if a machine can act like an insect, then it will be easy to develop it further later. This architecture relies upon the following principles:
- No representation
- Use the world instead of a model: react to the world constantly
- Finite state machines
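A toy sketch of these principles, not Brooks’s actual implementation; the layers and sensor values are invented for illustration:

```python
# No world model: each layer is a simple reactive behavior that looks
# only at the currently sensed world. A higher layer subsumes (overrides)
# the layer below when its trigger condition appears.
def wander(sensed):
    return "move_forward"        # lowest layer: default behavior

def avoid(sensed):
    if "obstacle" in sensed:
        return "turn_left"       # higher layer takes over
    return None                  # otherwise defer to the layer below

def act(sensed):
    return avoid(sensed) or wander(sensed)

print(act({"open_space"}))  # move_forward
print(act({"obstacle"}))    # turn_left
```

Adding a new layer on top (say, goal seeking) leaves the lower layers untouched, which is the incremental-improvement property claimed above.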
Genesis
Based upon language, this system involves the perception and description of events, which then allows it to understand stories and, beyond them, culture, at both the macro (country, religion…) and micro (family…) levels. It relies upon the strong story hypothesis.