In this short excursion I discuss the definition of Kolmogorov Complexity and work through a few examples.
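Kolmogorov complexity itself is uncomputable, but you can get a rough feel for "description length" by leaning on an ordinary compressor. Below is a minimal Python sketch (using zlib as a crude stand-in, which only gives an upper-bound flavor, and example strings of my own choosing): a highly regular string compresses far more than a random-looking string of the same length.

```python
import random
import zlib

def compressed_length(s: str) -> int:
    """Compressed size of s in bytes, used as a rough stand-in for description length."""
    return len(zlib.compress(s.encode("utf-8")))

# A highly regular string has a short description: "the string 'ab' repeated 500 times".
regular = "ab" * 500

# A string of coin flips over the same alphabet usually has no obviously shorter description.
random.seed(0)
scrambled = "".join(random.choice("ab") for _ in range(1000))

print(len(regular), compressed_length(regular))      # both strings are 1000 characters long...
print(len(scrambled), compressed_length(scrambled))  # ...but the regular one compresses far more
```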
In this episode I jump into abstract algebra.
I discuss the following topics:
1. Binary Operators
2. Closure Property
3. Commutativity
4. Associativity
5. Identity Elements
6. Inverse Elements
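As a concrete illustration of the properties listed above, here is a small brute-force Python check for one example structure of my own choosing, the set {0, 1, 2, 3, 4} under addition modulo 5:

```python
from itertools import product

# One concrete example: the set {0, 1, 2, 3, 4} under addition modulo 5.
elements = range(5)
op = lambda a, b: (a + b) % 5

# Closure: every result of the operation stays inside the set.
closed = all(op(a, b) in elements for a, b in product(elements, repeat=2))

# Commutativity: a * b == b * a for all pairs.
commutative = all(op(a, b) == op(b, a) for a, b in product(elements, repeat=2))

# Associativity: (a * b) * c == a * (b * c) for all triples.
associative = all(op(op(a, b), c) == op(a, op(b, c))
                  for a, b, c in product(elements, repeat=3))

# Identity element: some e with e * a == a * e == a for every a.
identity = next(e for e in elements
                if all(op(e, a) == a and op(a, e) == a for a in elements))

# Inverse elements: for every a, some b with a * b == b * a == identity.
has_inverses = all(any(op(a, b) == identity and op(b, a) == identity for b in elements)
                   for a in elements)

print(closed, commutative, associative, identity, has_inverses)  # True True True 0 True
```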
In this episode, I ramble on a bit about some of the mathematics behind neural networks, particularly activation functions and bias.
1. Activation Functions: https://en.wikipedia.org/wiki/Activation_function
I also talk about a book by Jeff Heaton, Introduction to the Math of Neural Networks. It's very short and simple but a nice fast read for a quick introduction to the topic. Check it out if you're interested: https://www.amazon.de/-/en/Jeff-Heaton-ebook/dp/B00845UQL6
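To make the activation-function and bias discussion concrete, here is a tiny sketch of a single artificial neuron: a weighted sum of its inputs plus a bias, passed through an activation function (the sigmoid and the particular numbers here are just illustrative choices).

```python
import math

def sigmoid(z: float) -> float:
    """Logistic activation: squashes any real number into the interval (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

def neuron(inputs: list[float], weights: list[float], bias: float) -> float:
    """One artificial neuron: weighted sum of inputs, plus bias, passed through the activation."""
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return sigmoid(z)

# The bias shifts the activation threshold: with the same inputs and weights,
# a more negative bias pushes the output toward 0, a more positive one toward 1.
x = [0.5, -1.0]
w = [0.8, 0.3]
print(neuron(x, w, bias=0.0))
print(neuron(x, w, bias=2.0))
```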
In this episode we discuss the conditional proposition, also called the conditional sentence.
Topics:
1. What "If P, then Q" means (the conditional)
2. "If P, then Q": definition, antecedent, consequent
3. Truth table for "If P, then Q"
4. Thinking about the conditional in terms of promises
5. True & False Examples
6. The Converse
7. The Contrapositive
8. The equivalence "If P, then Q" <=> "If ~Q, then ~P"
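One quick way to see the equivalence in point 8 is to enumerate every truth assignment. The small Python sketch below builds the truth table for the conditional, its converse, and its contrapositive; the conditional and contrapositive columns always match, while the converse column does not:

```python
from itertools import product

def implies(p: bool, q: bool) -> bool:
    """'If P, then Q' is false only in the one row where P is true and Q is false."""
    return (not p) or q

# Enumerate all truth assignments for P and Q.
print(f"{'P':<6}{'Q':<6}{'P->Q':<7}{'Q->P':<7}{'~Q->~P':<7}")
for p, q in product([True, False], repeat=2):
    print(f"{p!s:<6}{q!s:<6}{implies(p, q)!s:<7}{implies(q, p)!s:<7}{implies(not q, not p)!s:<7}")
```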
In this episode I talk about:
1. Logical Connectives: Conjunction, Disjunction, Negation.
2. Truth Tables
3. Examples of True and False well-formed formulas using conjunction, disjunction and negation.
4. Propositional forms.
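Here is a short Python snippet that prints the truth tables for these three connectives, along with one compound propositional form built from them (the particular form is just an arbitrary example):

```python
from itertools import product

# Truth tables for conjunction (and), disjunction (or), and negation (not),
# plus one compound propositional form combining them.
print(f"{'P':<6}{'Q':<6}{'P and Q':<9}{'P or Q':<8}{'not P':<7}(P and Q) or not Q")
for p, q in product([True, False], repeat=2):
    print(f"{p!s:<6}{q!s:<6}{(p and q)!s:<9}{(p or q)!s:<8}{(not p)!s:<7}{(p and q) or not q}")
```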
What is a Proposition?
A statement that can be true or false.
Examples: "It is raining." / "2 + 2 = 5." (false, but still a proposition, since it has a definite truth value)
Non-Proposition Examples: "What time is it?" (a question) / "Close the door." (a command)
Atomic Propositions - do not contain any other propositions - ex: It is raining.
Compound Propositions - are formed by combining atomic (simple) propositions with logical connectives - ex: I am drinking coffee and it's raining outside.
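As a small illustrative sketch (the class names here are my own, not standard terminology), atomic and compound propositions can be modeled as a tiny expression tree: atomic propositions are leaves, and compound ones combine sub-propositions with a connective.

```python
from dataclasses import dataclass

@dataclass
class Atomic:
    """An atomic proposition: contains no other propositions."""
    name: str
    def evaluate(self, values: dict) -> bool:
        return values[self.name]

@dataclass
class And:
    """A compound proposition: combines two sub-propositions with conjunction."""
    left: object
    right: object
    def evaluate(self, values: dict) -> bool:
        return self.left.evaluate(values) and self.right.evaluate(values)

# "I am drinking coffee and it's raining outside" as a compound proposition.
coffee = Atomic("drinking coffee")
raining = Atomic("raining outside")
sentence = And(coffee, raining)

print(sentence.evaluate({"drinking coffee": True, "raining outside": False}))  # False
print(sentence.evaluate({"drinking coffee": True, "raining outside": True}))   # True
```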