What "Overfitting" implies for our own learning practices.

I'm not a machine learning specialist at all, but I'm interested in the field and have read a few books about it, one of which is:


The Master Algorithm: How the Quest for the Ultimate Learning Machine Will Remake Our World


What interests me is the concept of "Overfitting."


Overfitting: https://en.wikipedia.org/wiki/Overfitting




When a machine learning model learns its training data too well, it fails at predicting the future.
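Here is a toy numerical sketch of that idea (my own illustration, assuming NumPy; the data is made up): a flexible degree-9 polynomial can fit ten noisy training points almost perfectly, yet it typically predicts unseen points worse than a plain straight line.

```python
# A toy illustration (assuming NumPy): a high-degree polynomial
# "learns too much" from noisy training data and generalizes worse
# than a straight line, even though its training error is near zero.
import numpy as np

rng = np.random.default_rng(0)
x_train = np.linspace(0, 1, 10)
y_train = 2 * x_train + rng.normal(0, 0.1, size=10)  # true pattern: y = 2x, plus noise

x_test = np.linspace(0, 1, 100)
y_test = 2 * x_test  # noiseless truth, standing in for "the future"

for degree in (1, 9):
    coeffs = np.polyfit(x_train, y_train, degree)
    train_err = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_err = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    print(f"degree {degree}: train error {train_err:.5f}, test error {test_err:.5f}")
```

The degree-9 model memorizes the noise in the ten training points; the straight line captures the underlying pattern and predicts new points better.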

A model can be so good at memorizing every detail of its training data that it fails to generalize. That is why deep learning engineers intentionally cut some of the connections between the layers of a network during training, a technique known as dropout.
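In code, that "cutting connections" trick looks like this. A minimal sketch, assuming PyTorch; the network and its layer sizes are arbitrary placeholders for illustration:

```python
# A minimal dropout sketch (assuming PyTorch; layer sizes are arbitrary).
# nn.Dropout randomly zeroes activations during training, so the network
# cannot rely on memorizing any single path through its connections.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(784, 256),
    nn.ReLU(),
    nn.Dropout(p=0.5),  # during training, drop half the activations at random
    nn.Linear(256, 10),
)

x = torch.randn(1, 784)

model.train()  # dropout active: some connections are cut
print(model(x))

model.eval()   # dropout disabled: the full network is used for prediction
print(model(x))
```

Forcing the network to work with randomly missing connections keeps it from memorizing the training data too precisely, which is exactly the point.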






I think this has great implications for our own learning practices as humans.





Sometimes we find that working hard does not necessarily bring good results.



Studying hard does not always translate into good test scores.

Working harder sometimes lowers productivity.

Perhaps, like an overfitted model, we sometimes memorize every detail of what we study and lose sight of the underlying pattern that would let us handle new problems.








As I wrote before, machine learning is not only about machines but also about learning.


We don't normally question what learning is, but in this age of machine learning, it's time to think about it.

It's also time to think about what it is that only humans can learn.