What "Overfitting" implies for our own learning practices.
I'm not a machine learning specialist at all, but I'm interested in the field and have read some books about it.
人工知能はどのようにして 「名人」を超えたのか?―――最強の将棋AIポナンザの開発者が教える機械学習・深層学習・強化学習の本質 (How Did Artificial Intelligence Surpass the "Meijin"? The developer of Ponanza, the strongest shogi AI, explains the essence of machine learning, deep learning, and reinforcement learning)
- Author: 山本一成 (Issei Yamamoto)
- Publisher: ダイヤモンド社 (Diamond, Inc.)
- Release date: 2017/05/11
- Format: Paperback (softcover)
The Master Algorithm: How the Quest for the Ultimate Learning Machine Will Remake Our World
- Author: Pedro Domingos
- Publisher: Penguin
- Release date: 2017/01/26
- Format: Paperback
What interests me is the concept of "Overfitting."
Overfitting: https://en.wikipedia.org/wiki/Overfitting
When a machine learning model learns its training data too closely, it fails to predict future, unseen cases.
The model becomes so good at memorizing every detail of the training examples that it fails to generalize. That is why deep learning engineers intentionally drop some of the connections between the layers of a network during training (a regularization technique known as dropout).
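The idea can be sketched in a few lines of NumPy. This is a minimal, hypothetical illustration (the data, polynomial degrees, and noise level are all my own choices, not from the books above): a flexible model memorizes noisy training points almost perfectly, yet predicts held-out points worse than a simpler one.

```python
import numpy as np

rng = np.random.default_rng(0)

# Noisy training samples of a simple underlying function (y = sin x).
x_train = np.linspace(0, 3, 10)
y_train = np.sin(x_train) + rng.normal(0, 0.2, x_train.size)

# Clean held-out points from the same function, for testing generalization.
x_test = np.linspace(0.1, 2.9, 50)
y_test = np.sin(x_test)

def mse(degree):
    # Fit a polynomial of the given degree to the training points,
    # then measure mean squared error on training and test sets.
    coeffs = np.polyfit(x_train, y_train, degree)
    train_err = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_err = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    return train_err, test_err

for degree in (1, 3, 9):
    train_err, test_err = mse(degree)
    print(f"degree {degree}: train MSE = {train_err:.4f}, test MSE = {test_err:.4f}")
```

The degree-9 polynomial passes through every noisy training point (near-zero training error), but its wiggles make the test error larger: it has memorized the noise instead of learning the pattern. That gap between training and test performance is overfitting.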
I think this has great implications for our own (human) learning practices.
Sometimes we find that working hard does not necessarily bring good results.
Studying hard does not always lead to good test scores.
Working longer hours sometimes lowers productivity.
As I wrote before, machine learning is not only about machines but also about learning.
We don't normally question what learning is, but in this age of machine learning, it's time to think about it.
Also, it's time to think about what it is that only humans can learn.