I get that I could flatten, but wouldn't that just produce 80,000 num_classes? I want 2 classes.

You measure whether softmax(pred) and y are equal, but softmax(pred) can never be exactly equal to 0.

You'll notice that when I measure accuracy and make predictions, I re-activate the output with tf.nn.sigmoid(pred). This is because the cost function, tf.nn.sigmoid_cross_entropy_with_logits, combines the activation and the loss in the same function, so when I make predictions I pass the raw outputs through the sigmoid again, squashing large negative values toward 0 and large positive values toward 1.

When I print the predictions after every training iteration, the magnitudes start out large, which makes sense given that the outputs pass through ReLUs, but after each iteration the values shrink until they sit roughly between 0 and 2.
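To make the "activation and loss combined" point concrete, here is a minimal NumPy sketch of what the fused op computes. The stable formula `max(x, 0) - x*z + log(1 + exp(-|x|))` is the one documented for `tf.nn.sigmoid_cross_entropy_with_logits`; the logits and labels below are made-up illustration values, and the 0.5 threshold for accuracy is one common choice, not the only one.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_xent_with_logits(logits, labels):
    # Numerically stable form documented for
    # tf.nn.sigmoid_cross_entropy_with_logits:
    #   max(x, 0) - x*z + log(1 + exp(-|x|))
    return np.maximum(logits, 0) - logits * labels + np.log1p(np.exp(-np.abs(logits)))

# Hypothetical raw network outputs (logits) and binary labels.
logits = np.array([-5.0, -0.5, 0.0, 2.0, 8.0])
labels = np.array([0.0, 1.0, 1.0, 1.0, 0.0])

loss = sigmoid_xent_with_logits(logits, labels)

# The naive form agrees for moderate logits but can overflow for large |x|.
naive = -labels * np.log(sigmoid(logits)) - (1 - labels) * np.log(1 - sigmoid(logits))
assert np.allclose(loss, naive)

# For accuracy, re-activate with sigmoid and threshold at 0.5 instead of
# testing exact equality (the sigmoid never outputs exactly 0 or 1).
preds = (sigmoid(logits) > 0.5).astype(np.float64)
accuracy = np.mean(preds == labels)
print(accuracy)
```

This also shows why comparing the activated outputs for exact equality with the labels never works: you have to threshold (or round) before comparing.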