00:47:20 Anze Slosar: But any Taylor expansion has a radius of convergence. How can we know that the expansion is valid?
00:49:02 Jan: Is it actually a Taylor expansion, or just a polynomial? That's not the same thing.
00:53:15 Van Dam, Hubertus: Just a comment: I also think that f is typically a simple function with sensible derivatives (in some cases even a finite number of non-zero derivatives).
00:55:56 Milind Diwan: Could you quickly review the 3 conditions again?
01:07:00 Van Dam, Hubertus: When you consider “cat” pictures, you look at only binary classifiers. If you want to classify pictures of “cat”, “dog”, “horse”, “raccoon”, etc., then you have two options:
01:07:15 Van Dam, Hubertus: 1. A multivalue classifier
01:07:30 Van Dam, Hubertus: 2. Multiple binary classifiers.
01:07:40 Van Dam, Hubertus: Can you comment on that?
01:14:30 mccorkle: Are you making assumptions about the threshold functions in these nodes (i.e., that they are sigmoid and well-behaved)?
01:45:30 mccorkle: The early paper(s) on multi-layer perceptrons nicely show what the nodes are doing: basically cutting the parameter space into half-planes, returning 1 if the vector position is above the cutoff and 0 if lower.
01:51:30 mccorkle: A great intro book is “Perceptrons” by Marvin Minsky and Seymour Papert (1969).
02:02:25 Milind Diwan: Thanks George, Daniel. Thanks for this glimpse of analysis.
02:03:17 Van Dam, Hubertus: Thanks Daniel.
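
[Editor's note] The half-plane behavior mccorkle describes at 01:45:30 can be sketched in a few lines. This is a minimal illustration, not material from the lecture: a single perceptron node with a hard threshold returns 1 when the input lies on one side of a hyperplane (a half-plane in 2D) and 0 otherwise. The weight and bias values below are made up for the example.

```python
def perceptron(x, w, b):
    """Hard-threshold node: return 1 if w . x + b > 0, else 0.

    The boundary w . x + b = 0 is a hyperplane (a line in 2D),
    so the node splits the input space into two half-planes.
    """
    activation = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 if activation > 0 else 0

# Illustrative boundary: x0 + x1 - 1 = 0, i.e. the line x1 = 1 - x0.
w = [1.0, 1.0]
b = -1.0

print(perceptron([1.0, 1.0], w, b))  # above the line -> 1
print(perceptron([0.0, 0.0], w, b))  # below the line -> 0
```

Stacking layers of such nodes lets a network combine several half-plane cuts into more complex decision regions, which is the picture the early multi-layer perceptron papers draw.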