7.9: TensorFlow.js Color Classifier: Softmax and Cross Entropy

In this video, I implement the last layer of the classifier model and cover the softmax activation function and cross entropy loss function.
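A minimal sketch of what those two pieces compute, in plain JavaScript rather than TensorFlow.js ops (the function names here are illustrative, not taken from the video's code):

```javascript
// Softmax turns raw scores (logits) into probabilities that sum to 1.
function softmax(logits) {
  const exps = logits.map(Math.exp);
  const sum = exps.reduce((a, b) => a + b, 0);
  return exps.map((e) => e / sum);
}

// Categorical cross-entropy: -sum(target[i] * log(prediction[i])).
// With a one-hot target this reduces to -log(probability of the true class).
function crossEntropy(target, prediction) {
  return -target.reduce((acc, t, i) => acc + t * Math.log(prediction[i]), 0);
}

const logits = [2.0, 1.0, 0.1];
const probs = softmax(logits);
const oneHot = [1, 0, 0]; // the true color label, as a one-hot vector
console.log(probs);
console.log(crossEntropy(oneHot, probs)); // small when probs[0] is large
```

In TensorFlow.js the equivalent work is done by built-in ops, but the arithmetic is the same: the loss is low exactly when the softmax output assigns high probability to the correct label.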

🎥 Next Video:

🔗 Crowdsource Color Data:
🔗 TensorFlow.js:
🔗 ml5.js:

🎥 TensorFlow.js playlist:
🎥 XOR Problem:

🚂 Website:
💖 Patreon:
Store:
📚 Book recommendations:


🎥 Intro to Programming:
🎥 Coding Challenges:




Comments

21 responses to “7.9: TensorFlow.js Color Classifier: Softmax and Cross Entropy”

  1. Kelvin Zhao

    I can't seem to find categorical cross entropy under https://js.tensorflow.org/api/0.12.0/#Training-Losses; it is under Metrics instead. Why?

  2. Lukáš Kovář

    Please make an ml5.js tutorial.

  3. Александр Завалишин

    Hello! Your lessons are great! Please add Russian subtitles.

  4. Radu Bretean

    I like these tutorials, but they are too short. I had already watched your past TensorFlow videos, but I quickly ended up at the most recent ones. Honestly, I wait for you to publish 3-4 videos and then watch them all at once. The videos are so interesting that 10 minutes feels like "just a taste". Thank you, Dan, for your work, and keep the TensorFlow videos coming 😉

  5. Abhishek Kumar

    Thank you for this awesome video.

  6. חן הולנדר

    Love your channel. I have learnt a lot about machine learning and neural networks, and I used your neural network library for most of my projects. Now I'm trying to program a bot to play the classic Snake game well, but I'm having a lot of trouble training it with the library :/ I would love to see a video about creating a Snake bot (like the Flappy Bird series, although I found the two very different, not the same technique at all), and I think it could help a lot of people understand more about this kind of machine learning. Thanks!

  7. Live_ Destin

    Please do a video on how to teach programming, or something like a tutorial on making a tutorial. Just an idea.

    Great video though, I love all of them.

  8. kustomweb

    It feels like every line of this code deserves its own video.

  9. Gelio

    Great job, I love your videos and I am learning a ton from them! Thanks!

    By the way, do the probabilities in the output vector have to add up to 1 (or 100%)? As far as I know it is not necessary, since we may end up with a vector like [0.5, 0.6, 0.5, 0, 0.1, 0.1, 0.1, 0.1, 0.1]. I am referring to the statement at 2:52.

    Am I correct?

    EDIT: Nevermind, I got my answer around 5:00. I did not know softmax does that 😀 Another thing learned!
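To Gelio's point: softmax is exactly what forces the outputs to sum to 1, whatever the raw scores are. A quick sketch using the vector from the comment (plain JavaScript, illustrative only):

```javascript
// Softmax maps any real-valued vector to a probability distribution.
function softmax(v) {
  const exps = v.map(Math.exp);
  const sum = exps.reduce((a, b) => a + b, 0);
  return exps.map((e) => e / sum);
}

// The raw vector from the comment sums to 2.1, not 1...
const raw = [0.5, 0.6, 0.5, 0, 0.1, 0.1, 0.1, 0.1, 0.1];
// ...but after softmax, the entries always sum to 1.
const probs = softmax(raw);
console.log(probs.reduce((a, b) => a + b, 0)); // ~1
```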

  10. constantin ivanov

    Thank you for this video. Спасибо (thank you).

  11. Akash Balerao

    Can you do a coding challenge of building a Sudoku generator and solver?

  12. bjaeken

    How do I get TensorFlow, and where can I download it? I would love to spend some time on it. By the way, nice video!

  13. Şamil Ö.

    The problem with softmax is the NaN situation. I had this problem while coding a CNN model. So if you are building your own network without this library, you need to look into log-softmax.
    (I can't give a good explanation of this.)
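The NaN issue Şamil describes most likely comes from Math.exp overflowing for large logits; subtracting the maximum first (the trick behind log-softmax and the log-sum-exp identity) avoids it. A sketch, assuming plain JavaScript:

```javascript
// Naive softmax overflows: Math.exp(1000) === Infinity, so every entry
// becomes Infinity / Infinity === NaN.
function naiveSoftmax(v) {
  const exps = v.map(Math.exp);
  const sum = exps.reduce((a, b) => a + b, 0);
  return exps.map((e) => e / sum);
}

// Stable version: subtract the max first (softmax is shift-invariant),
// so the largest exponent is exp(0) = 1. This is the idea behind log-softmax.
function stableSoftmax(v) {
  const max = Math.max(...v);
  const exps = v.map((x) => Math.exp(x - max));
  const sum = exps.reduce((a, b) => a + b, 0);
  return exps.map((e) => e / sum);
}

console.log(naiveSoftmax([1000, 1000]));  // [NaN, NaN]
console.log(stableSoftmax([1000, 1000])); // [0.5, 0.5]
```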

  14. Daniel Astillero

    I watched several videos about the softmax function, but this is the only one that made it click! Thanks, Daniel!

  15. Blert Shabani

    580 views in the first minute great!

  16. Julian Nicholls

    An easy way to use const and let: always use const initially (unless you're not initialising the value), then go back and change it to let if it becomes necessary to reassign it. This makes you think about every use of a mutable variable.
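Julian's workflow in miniature (a trivial, illustrative sketch):

```javascript
// Start with const everywhere; the engine throws on reassignment,
// which flags exactly the variables that genuinely need to be mutable.
const learningRate = 0.1; // never reassigned, so const stays
let epoch = 0;            // reassigned below, so it had to become let

while (epoch < 3) {
  epoch += 1; // this reassignment is why epoch is let, not const
}
console.log(learningRate, epoch); // 0.1 3
```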

  17. simone icardi

    Despite the fact that I will never get all the mathematical aspects, I feel I keep having an overall understanding of all this. You are such a wizard at teaching, Dan!

  18. atrumluminarium

    Fun fact: the softmax gives what is called the Boltzmann distribution, which in physics is the cornerstone of statistical mechanics. The entropy of a system (most commonly a gas) is the categorical cross-entropy of the Boltzmann distribution with itself.

  19. returnexitsuccess

    Also, the method of normalizing the vector by its sum does not account for the possibility of negative values. For example, (-1, 5, 6) would be normalized to (-0.1, 0.5, 0.6), but those are clearly not all between 0 and 1. And what if you get a vector like (-1, 1) that sums to zero, so you can't divide by the sum at all? Softmax solves both of these problems, since the output of the exponential function is always positive.
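Both failure modes in that comment are easy to reproduce; a sketch in plain JavaScript:

```javascript
// Dividing by the sum fails for negative entries and for zero-sum vectors.
function sumNormalize(v) {
  const sum = v.reduce((a, b) => a + b, 0);
  return v.map((x) => x / sum);
}

// Softmax exponentiates first, so every term is positive and the sum
// can never be zero. (The max is subtracted for numerical stability.)
function softmax(v) {
  const max = Math.max(...v);
  const exps = v.map((x) => Math.exp(x - max));
  const sum = exps.reduce((a, b) => a + b, 0);
  return exps.map((e) => e / sum);
}

console.log(sumNormalize([-1, 5, 6])); // [-0.1, 0.5, 0.6]: not probabilities
console.log(sumNormalize([-1, 1]));    // divide by zero: [-Infinity, Infinity]
console.log(softmax([-1, 5, 6]));      // all in (0, 1), summing to 1
```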

  20. Phil Boswell

    I take it you haven't yet recorded the SoftMax video? I couldn't find it on your channel…
