First, I am talking about neural networks in the narrow sense here. Systems that model neural structure in depth (such as the Darwin series) are outside the scope of this discussion.
Recently many friends have asked me about neural networks, hoping to use them to solve their problems. I have always held a certain bias against neural networks, mainly because of their theoretical foundations. A neural network can indeed approximate any function arbitrarily well; the real difficulties lie in the learning method and in the choice of hidden-layer neurons. On the learning side, backpropagation is a powerful tool, and in many cases it does reach the global optimum. But since backpropagation is really just gradient descent on the error surface, it can unavoidably fall into a local optimum, a failure mode that in training is often lumped together with overfitting, though the two are distinct problems. Finding a good set of weights can be viewed as a search over a state space, which is why people also turn to genetic algorithms, ant-colony optimization, and the like to search for the global optimum. But the theoretical foundations of genetic and ant-colony algorithms are themselves shaky and not very convincing.
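The local-optimum trap is easy to demonstrate on a one-dimensional toy loss (a minimal sketch; the loss function, learning rate, and starting points below are made up purely for illustration):

```python
# Toy one-dimensional loss with two minima: a shallow one near
# x ≈ +0.96 and a deeper, global one near x ≈ -1.04.
def f(x):
    return x**4 - 2 * x**2 + 0.3 * x

def grad(x):
    return 4 * x**3 - 4 * x + 0.3

def gradient_descent(x0, lr=0.01, steps=500):
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)
    return x

# Starting on the right-hand slope, plain gradient descent settles
# into the shallow local minimum and never reaches the global one.
x_right = gradient_descent(1.5)   # ends near +0.96 (local minimum)
x_left = gradient_descent(-1.5)   # ends near -1.04 (global minimum)
print(f(x_right), f(x_left))      # the left basin is clearly lower
```

Which basin the method lands in depends entirely on the starting point; nothing in the update rule can escape the shallow basin once inside it, which is exactly why global search heuristics get brought in.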
For a neural network to find a reasonably good solution, a large number of samples is needed. For face detection, for example, at least 2,000 positive samples and 8,000 negative samples are required to obtain good results. This is the price paid for the model's weak theoretical foundations.
The hidden-layer neurons can be viewed as performing dimensionality reduction, but without the explicit mathematical meaning of something like PCA. As a result, the effect of the reduction cannot be derived precisely, and the number of hidden neurons can only be determined by experiment.
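To make the contrast concrete, here is a minimal PCA sketch via SVD on synthetic data (the dimensions, noise level, and 0.99 threshold are made-up illustration values): the explained-variance ratios state directly how many components are needed, a diagnostic that hidden-layer neurons do not offer.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: 500 samples in 10 dimensions, but the real
# signal lives in a 3-dimensional subspace plus small noise.
latent = rng.normal(size=(500, 3))
mixing = rng.normal(size=(3, 10))
X = latent @ mixing + 0.05 * rng.normal(size=(500, 10))

# PCA via SVD of the centered data: each singular value has a
# precise meaning, namely the variance captured along that axis.
Xc = X - X.mean(axis=0)
_, s, _ = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / np.sum(s**2)
print(np.round(explained, 3))  # the first 3 ratios dominate

# Smallest number of components whose cumulative explained
# variance reaches 99% -- derived, not found by trial and error.
k = int(np.searchsorted(np.cumsum(explained), 0.99) + 1)
print(k)  # → 3
```

With hidden neurons there is no analogous quantity to read off; one can only sweep the layer width and compare validation errors.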
Still, the flaws do not outweigh the merits. For black-box problems where only the inputs and outputs are known and no explicit mathematical model exists, approximating with a neural network is, at this stage, a practical approach. My advice is: when an explicit mathematical model is available, use it. For time series, use a hidden Markov model; for classification, use a support vector machine or a Bayesian model. Models like these, with solid mathematical foundations, generally give much clearer guidance for the design.
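As an illustration of how transparent such a model can be, here is a minimal Gaussian naive Bayes classifier in plain NumPy (the two-class toy data is made up): every fitted quantity is just a per-class mean, variance, and prior, each with an exact probabilistic meaning.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two well-separated Gaussian classes in 2-D, 200 samples each.
X0 = rng.normal(loc=[0.0, 0.0], scale=1.0, size=(200, 2))
X1 = rng.normal(loc=[3.0, 3.0], scale=1.0, size=(200, 2))
X = np.vstack([X0, X1])
y = np.array([0] * 200 + [1] * 200)

# Fit: per class, a mean and variance for each feature plus a prior.
def fit(X, y):
    params = {}
    for c in np.unique(y):
        Xc = X[y == c]
        params[c] = (Xc.mean(axis=0), Xc.var(axis=0), len(Xc) / len(X))
    return params

# Predict: pick the class with the larger log-posterior under
# the naive (per-feature independent Gaussian) assumption.
def predict(params, x):
    def log_post(c):
        mu, var, prior = params[c]
        ll = -0.5 * np.sum(np.log(2 * np.pi * var) + (x - mu) ** 2 / var)
        return ll + np.log(prior)
    return max(params, key=log_post)

params = fit(X, y)
print(predict(params, np.array([0.2, -0.1])))  # → 0
print(predict(params, np.array([2.8, 3.1])))   # → 1
```

Unlike a trained network's weights, the fitted parameters here can be inspected and checked against the data directly, which is the kind of design guidance a solid mathematical model provides.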