- Around 10-15 hours per week
- In PGM, why the word model? (Keeps data, algorithm, domain knowledge separate)
- Why the word probabilistic? (Handles uncertainty: partial knowledge of the world, noisy observations, phenomena not covered by the model, and inherent stochasticity.)
- Why graphical? (From a CS perspective, it uses graph data structures to compactly represent joint distributions over many random variables.)
- What are two primary classes of PGM? (Bayesian networks and Markov networks)
- What are Bayesian networks? (One of the two main classes of PGM; it uses a directed graph for representation. Random variables are represented as nodes, and the edges represent probabilistic connections between the nodes.)
- What are Markov networks? (The other main class of PGM, which uses undirected graphs.)
- What are the advantages of using a graphical model? (1. intuitive and compact data structure 2. efficient reasoning using general-purpose algorithms 3. sparse parameterization (feasible elicitation and learning from data))
- What is a joint distribution?
- What are parameters and independent parameters in the context of a joint probability distribution?
- What is conditioning? What is reduction?
- After reduction, is it still a probability distribution? (No, doesn't sum to 1) If not, how to convert it to a probability distribution? (Renormalization)
- What is marginalization? (A procedure that takes a probability distribution over a larger set of variables and produces a probability distribution over a smaller subset of them)
- What is a factor?
- What is the scope of a factor? (the variables (not constants) in its domain)
- How to perform a product of two factors?
- What is factor marginalization?
- How to marginalize a factor?
- How to perform factor reduction? (See the factor operations sketch at the end of these notes.)
- Why are factors useful? (1. fundamental building blocks for defining distributions in high-dimensional spaces 2. set of basic operations for manipulating these probability distributions)
- What are the steps in solving a problem? (1. Define a graph based on the problem at hand. 2. Define a CPD at each node.)
- What is a Bayesian network? (A DAG whose nodes represent r.v.s and where each node has a CPD defining its dependency on its parents)
- What is the chain rule formula for a BN? (See the chain-rule sketch at the end of these notes.)
- Prove that a Bayesian network defines a legal probability distribution.
- What is causal reasoning? (reasoning goes from top to bottom in the graph)
- What is evidential reasoning? (reasoning goes from bottom to top in the graph)
- What is intercausal reasoning? (reasoning happens in a lateral direction)
- How are nodes that have no lateral connections still related in an intercausal way? (See the v-structure sketch at the end of these notes.)
- What is an active trail in a graph when nothing is observed?
- When and how can influence flow in a graph when no node is observed? Which case doesn't lead to any influence flow?
- When one or more nodes are observed, how does the above scenario change?
- What is an active trail in a graph when some nodes are observed?
- What is independence?
- What is conditional independence? (Note that conditioning can both gain and lose independence)
- Why are independence and factorization related to each other?
- What does it mean for nodes in a graph to be d-separated?
- What are I-maps?
- How are independencies, I-maps, and distributions related?
- What are two equivalent views of a graph structure?
- What is the assumption of Naive Bayes?
- What is the Bernoulli Naive Bayes model? (One binary feature per dictionary word, encoding its absence/presence in the document)
- What is the Multinomial Naive Bayes model? (One feature per word position in the document; the independence assumption is that the distribution over words is the same at every position)
- Where is Naive Bayes useful? (Scenarios where there are many weakly related features; see the Naive Bayes sketch at the end of these notes.)
- Why don't we ever assign zero probabilities to an event?
- Why are template models required? (Structure and parameter sharing across different models and within a model)
- What are Dynamic Bayesian networks? (for temporal data)
- What are object relational models? (Directed, e.g. plate models, and undirected variants)
- What is the Markov assumption? Why is it needed?
- Is the Markov assumption too strong? If yes, how to handle it? (Often, yes; e.g. a robot's position alone doesn't capture its velocity. Such information can be included in the state to make the situation better; otherwise, add dependencies further back in time)
- What is a semi-Markov model?
- How to bound the number of distributions? (Time-invariance)
- When doesn't time-invariance work? How to handle this situation?
- What are intra-time slice and inter-time slice edges? What are persistence edges?
- What is a 2-time slice Bayesian network?
- What is a Dynamic Bayesian network?
- What is ground or unrolled network?
- What is an HMM? How does it contrast with a DBN? (HMMs are a sub-class of DBNs; see the HMM forward-pass sketch at the end of these notes.)
- What are plate models?
- What is explicit parameter sharing?
- What are nested plates? What are overlapping plates?
- What is collective inference?
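
Factor operations sketch: a minimal, toy illustration of factors and the basic operations on them (product, marginalization, reduction, renormalization). The dict-based representation, variable names, and numbers are all made up for illustration; this is only a sketch, not a reference implementation.

```python
from itertools import product

# A factor is represented as (scope, table): scope is a tuple of variable
# names, and table maps full assignments (value tuples in scope order) to
# non-negative numbers.

def factor_product(f1, f2, domains):
    """Multiply two factors; the result's scope is the union of both scopes."""
    scope1, tab1 = f1
    scope2, tab2 = f2
    scope = tuple(scope1) + tuple(v for v in scope2 if v not in scope1)
    table = {}
    for assignment in product(*(domains[v] for v in scope)):
        a = dict(zip(scope, assignment))
        table[assignment] = (tab1[tuple(a[v] for v in scope1)]
                             * tab2[tuple(a[v] for v in scope2)])
    return scope, table

def factor_marginalize(f, var):
    """Sum out `var`, producing a factor over the remaining variables."""
    scope, tab = f
    i = scope.index(var)
    table = {}
    for assignment, value in tab.items():
        key = assignment[:i] + assignment[i + 1:]
        table[key] = table.get(key, 0.0) + value
    return scope[:i] + scope[i + 1:], table

def factor_reduce(f, var, value):
    """Keep only rows consistent with var == value and drop var from the scope.
    The result need not sum to 1; renormalize to recover a distribution."""
    scope, tab = f
    i = scope.index(var)
    table = {a[:i] + a[i + 1:]: v for a, v in tab.items() if a[i] == value}
    return scope[:i] + scope[i + 1:], table

def renormalize(f):
    """Divide every entry by the total mass so the factor sums to 1."""
    scope, tab = f
    z = sum(tab.values())
    return scope, {a: v / z for a, v in tab.items()}

# Toy usage (made-up numbers): phi1(A, B) * phi2(B, C), then sum out B,
# then reduce to C = 1 and renormalize.
domains = {"A": [0, 1], "B": [0, 1], "C": [0, 1]}
phi1 = (("A", "B"), {(0, 0): 0.5, (0, 1): 0.8, (1, 0): 0.1, (1, 1): 0.0})
phi2 = (("B", "C"), {(0, 0): 0.5, (0, 1): 0.7, (1, 0): 0.1, (1, 1): 0.2})
prod = factor_product(phi1, phi2, domains)
marg = factor_marginalize(prod, "B")
reduced = renormalize(factor_reduce(prod, "C", 1))
```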
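
Chain-rule sketch: a numeric illustration of the BN chain rule P(X1, ..., Xn) = prod_i P(Xi | Parents(Xi)) on a made-up binary chain A -> B -> C. The final assertion checks numerically that the product of CPDs sums to 1, i.e. that the network defines a legal distribution; all CPD numbers are arbitrary.

```python
from itertools import product

# Made-up binary network A -> B -> C with arbitrary CPD numbers.
p_a = {0: 0.6, 1: 0.4}                                    # P(A)
p_b_given_a = {0: {0: 0.7, 1: 0.3}, 1: {0: 0.2, 1: 0.8}}  # P(B | A)
p_c_given_b = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.5, 1: 0.5}}  # P(C | B)

# Chain rule for BNs: P(A, B, C) = P(A) * P(B | A) * P(C | B).
joint = {}
for a, b, c in product([0, 1], repeat=3):
    joint[(a, b, c)] = p_a[a] * p_b_given_a[a][b] * p_c_given_b[b][c]

# Each CPD sums to 1 over its child for every parent assignment, so the
# product sums to 1 as well: the chain rule yields a legal distribution.
assert abs(sum(joint.values()) - 1.0) < 1e-9
```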
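
V-structure sketch: a numeric check of how observing a node changes independence, using a made-up v-structure X -> Z <- Y. Marginally X and Y are independent, but conditioning on the common effect Z activates the trail (intercausal reasoning / explaining away). All CPD numbers are arbitrary.

```python
from itertools import product

# Made-up v-structure X -> Z <- Y with binary variables and arbitrary CPDs.
p_x = {0: 0.7, 1: 0.3}
p_y = {0: 0.6, 1: 0.4}
p_z_given_xy = {(0, 0): {0: 0.99, 1: 0.01}, (0, 1): {0: 0.4, 1: 0.6},
                (1, 0): {0: 0.3, 1: 0.7},   (1, 1): {0: 0.1, 1: 0.9}}

joint = {(x, y, z): p_x[x] * p_y[y] * p_z_given_xy[(x, y)][z]
         for x, y, z in product([0, 1], repeat=3)}

def prob(pred):
    """Total probability of all assignments (x, y, z) satisfying pred."""
    return sum(v for (x, y, z), v in joint.items() if pred(x, y, z))

# With Z unobserved the trail X -> Z <- Y is not active, so X and Y are
# independent: P(X=1, Y=1) == P(X=1) * P(Y=1).
assert abs(prob(lambda x, y, z: x == 1 and y == 1)
           - prob(lambda x, y, z: x == 1) * prob(lambda x, y, z: y == 1)) < 1e-9

# Observing Z = 1 activates the trail, so conditioning *loses* the independence:
# P(X=1 | Z=1) differs from P(X=1 | Y=1, Z=1) (explaining away).
p_x1_given_z1 = (prob(lambda x, y, z: x == 1 and z == 1)
                 / prob(lambda x, y, z: z == 1))
p_x1_given_y1_z1 = (prob(lambda x, y, z: x == 1 and y == 1 and z == 1)
                    / prob(lambda x, y, z: y == 1 and z == 1))
assert abs(p_x1_given_z1 - p_x1_given_y1_z1) > 1e-3
```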
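
Naive Bayes sketch: a minimal multinomial Naive Bayes classifier with add-one (Laplace) smoothing, which is one standard way to avoid assigning zero probability to unseen words. The tiny training "documents", labels, and smoothing constant are made up for illustration.

```python
import math
from collections import Counter

# Made-up training "documents" and labels.
train = [("buy cheap pills now", "spam"),
         ("cheap pills cheap", "spam"),
         ("meeting agenda for monday", "ham"),
         ("lunch meeting on friday", "ham")]

vocab = {w for doc, _ in train for w in doc.split()}
labels = {y for _, y in train}

# Per-class word counts (multinomial model) and document counts (class prior).
word_counts = {y: Counter() for y in labels}
doc_counts = Counter()
for doc, y in train:
    word_counts[y].update(doc.split())
    doc_counts[y] += 1

def log_posterior(doc, y, alpha=1.0):
    """log P(y) + sum_i log P(w_i | y), with add-alpha smoothing so unseen
    words never receive probability zero."""
    total = sum(word_counts[y].values())
    logp = math.log(doc_counts[y] / sum(doc_counts.values()))
    for w in doc.split():
        logp += math.log((word_counts[y][w] + alpha)
                         / (total + alpha * len(vocab)))
    return logp

def classify(doc):
    return max(labels, key=lambda y: log_posterior(doc, y))

print(classify("cheap pills"))     # expected: spam
print(classify("monday meeting"))  # expected: ham
```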
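
HMM forward-pass sketch: a minimal filtering example that treats the HMM as a DBN, unrolling its 2-time-slice model (transition + emission CPDs) once per observation and renormalizing after each reduction on the evidence. The matrices are made up, and the convention of applying the transition before the first emission is an assumption of this sketch.

```python
import numpy as np

# Made-up 2-state HMM: transition, emission, and initial distributions.
transition = np.array([[0.9, 0.1],   # P(S_t | S_{t-1}); rows index S_{t-1}
                       [0.2, 0.8]])
emission = np.array([[0.7, 0.3],     # P(O_t | S_t); rows index S_t
                     [0.1, 0.9]])
initial = np.array([0.5, 0.5])       # P(S_0)

def forward(observations):
    """Filtered beliefs P(S_t | o_1..o_t), obtained by unrolling the
    2-time-slice model once per observation."""
    belief = initial
    beliefs = []
    for o in observations:
        belief = belief @ transition      # transition CPD of the 2-TBN
        belief = belief * emission[:, o]  # reduce by the observed evidence
        belief = belief / belief.sum()    # renormalize
        beliefs.append(belief)
    return beliefs

print(forward([0, 0, 1, 1]))
```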