## Question

Neural Network Calculation: Entropy Term, etc.

Hello Dr. B,

Does the code in your book, `E <- function(y,yhat) -(y*log(yhat) + (1-y)*log(1-yhat))`, only account for n = 1 (a single training instance)?

In our homework, I assume n = 3 because I assume the number of training instances equals the number of input nodes.

I'm also confused about Formula 2.24. Since the objective function is an "arg min", I expected multiple values as arguments, but all I see is the sum of the entropy and the L2 norm.

Thanks,

Bertilla

## Answers and follow-up questions

**Answer or follow-up question 1**

Dear Bertilla,

"Does the code in your book, E <- function(y,yhat) -(y*log(yhat) + (1-y)*log(1-yhat)), only account for n=1?"

Yes.

"In our homework, I assume n=3 because I assume number of training instances = number of input nodes."

OK, just take the mean of the per-instance entropies: `mean(E(y, yhat))`.
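As a concrete sketch of what that looks like for n = 3 (the label and prediction vectors below are made-up example values, not from the homework):

```r
# Per-instance binary cross-entropy, as defined in the book
E <- function(y, yhat) -(y * log(yhat) + (1 - y) * log(1 - yhat))

# Hypothetical labels and predicted probabilities for n = 3 instances
y    <- c(1, 0, 1)
yhat <- c(0.9, 0.2, 0.7)

# E() is vectorized, so it returns one entropy value per instance;
# averaging them gives the mean cross-entropy over the n instances
mean(E(y, yhat))
```

Because R's arithmetic operators are vectorized, the same one-line function works for any n; only the final `mean()` changes the n = 1 case into the n > 1 case.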

"I'm also confused about Formula 2.24. Since the objective function is "arg min", I expect multiple values as arguments but all I have is

really just the sum of Entropy and L2norm."

The objective function has parameters w and b, which you do not set yourself; the optimization finds them. "arg min" returns those argument values, i.e., the w and b that minimize the sum of the entropy and the L2 norm, rather than the minimum value of the objective itself.
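A sketch of the kind of expression Formula 2.24 describes, reconstructed from the pieces mentioned above (mean cross-entropy plus an L2 penalty); the symbol λ and the exact notation are my assumptions and may differ from the book's:

```latex
\hat{w}, \hat{b}
= \underset{w,\, b}{\arg\min}\;
  \underbrace{-\frac{1}{n}\sum_{i=1}^{n}
    \left[ y_i \log \hat{y}_i + (1 - y_i)\log(1 - \hat{y}_i) \right]}_{\text{mean cross-entropy}}
  \;+\;
  \underbrace{\lambda \,\lVert w \rVert_2^2}_{\text{L2 norm penalty}}
```

The "multiple values" are the outputs on the left-hand side: arg min hands back the pair (w, b), not a single scalar.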

Michel Ballings
