A Collection of Neural Network Papers Related to Brain Science
12 Feb 2024

Every year our department runs a one-semester course, Special Topics in Theoretical Biophysics. This year's lecturer works on brain science, so the selected papers are almost all about neural networks.
Back in my second year of the PhD I boasted to my advisor that I knew a little about artificial intelligence; since then I have attended a summer school and am now a postdoc, yet the original project still has not made much progress. My collaborators and I used to agree that the bottleneck was the quality and quantity of the training data, but from my own perspective I wonder whether I should take smaller steps and first finish something on a simpler project. Then, if problems come up again, I can at least rule out my own programming skills as the weak link.
Hierarchical neural circuits motivate deep convolutional neural networks (CNNs) (2012)
- Rumelhart DE, Hinton GE, Williams RJ (1986). Learning representations by back-propagating errors. Nature 323: 533–536. [Paper][Notes]
- LeCun Y, Boser B, Denker JS, Henderson D, Howard RE, Hubbard W, Jackel LD (1989). Backpropagation applied to handwritten zip code recognition. Neural Comput 1: 541–551. [Paper][Notes]
- Krizhevsky A, Sutskever I, Hinton GE (2012). ImageNet classification with deep convolutional neural networks. Adv. Neural Inf. Process. Syst. 25: 1097–1105. [Paper][Notes]
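
As a reminder of what these models actually compute, here is a minimal NumPy sketch of the single building block they all stack: a small filter slid across an image (a valid cross-correlation, which deep-learning libraries call convolution), followed by a ReLU nonlinearity. The image and the edge filter below are made up for illustration and are not taken from any of these papers.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid cross-correlation of a 2-D image with a 2-D kernel
    (what deep-learning libraries call 'convolution')."""
    H, W = image.shape
    kH, kW = kernel.shape
    out = np.zeros((H - kH + 1, W - kW + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kH, j:j + kW] * kernel)
    return out

def relu(x):
    return np.maximum(x, 0.0)

# Toy example: an 8x8 "image" and a 3x3 vertical-edge filter.
rng = np.random.default_rng(0)
image = rng.normal(size=(8, 8))
edge_filter = np.array([[1., 0., -1.],
                        [1., 0., -1.],
                        [1., 0., -1.]])

feature_map = relu(conv2d(image, edge_filter))
print(feature_map.shape)  # (6, 6): one feature map of local edge responses
```

A deep CNN is many such filter banks stacked, with pooling in between and the filter weights learned by backpropagation rather than hand-picked as above.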
Task-optimized deep CNNs predict aspects of neural responses in brains (2014)
- Yamins DLK, Hong H, Cadieu CF, Solomon EA, Seibert D, DiCarlo JJ (2014). Performance-optimized hierarchical models predict neural responses in higher visual cortex. Proc Natl Acad Sci USA 111: 8619–8624. [Paper][Notes]
- Khaligh-Razavi SM, Kriegeskorte N (2014). Deep supervised, but not unsupervised, models may explain IT cortical representation. PLoS Comput Biol 10: e1003915. [Paper][Notes]
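
Neither paper trains on neural data; both compare model features with recordings after the fact. One standard recipe, representational similarity analysis as used by Khaligh-Razavi and Kriegeskorte, builds a stimulus-by-stimulus dissimilarity matrix from the model features and another from the neural responses, then correlates the two. The sketch below shows only the bookkeeping, with random arrays standing in for real CNN features and IT recordings.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
n_stimuli = 50

# Stand-ins for real data: model features (e.g. one CNN layer) and
# neural responses (e.g. an IT population) for the same 50 stimuli.
model_features = rng.normal(size=(n_stimuli, 512))
neural_responses = rng.normal(size=(n_stimuli, 100))

# Representational dissimilarity matrices, in condensed form
# (pdist returns the upper triangle as a flat vector).
rdm_model = pdist(model_features, metric="correlation")
rdm_neural = pdist(neural_responses, metric="correlation")

# Spearman correlation between the two RDMs: how similar are the
# representational geometries of model and brain?
rho, _ = spearmanr(rdm_model, rdm_neural)
print(f"model-brain RDM correlation: {rho:.3f}")
```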
Visualizing unit activity in deep CNNs and brains
- Zeiler MD, Fergus R (2014). Visualizing and understanding convolutional networks. https://arxiv.org/abs/1311.2901. [Paper][Notes]
- Bashivan P, Kar K, DiCarlo JJ (2019). Neural population control via deep image synthesis. Science 364: eaav9436. [Paper][Notes]
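
Both visualization papers ask, in different ways, what input drives a given unit. A common tool in this spirit (closest to the image synthesis of Bashivan et al.) is activation maximization: start from a noise image and take gradient-ascent steps that increase the unit's response. The sketch below does this for a made-up linear-ReLU "unit"; for a real CNN the gradient would come from automatic differentiation rather than the hand-written derivative used here.

```python
import numpy as np

rng = np.random.default_rng(2)
n_pixels = 64

# A made-up "unit": response = relu(w . x).  For a real network, the unit
# would sit inside a CNN and the gradient would come from backprop.
w = rng.normal(size=n_pixels)

def response(x):
    return max(float(np.dot(w, x)), 0.0)

def grad_response(x):
    # d/dx relu(w . x) = w where the unit is active, 0 elsewhere
    return w if np.dot(w, x) > 0 else np.zeros_like(w)

# Activation maximization: gradient ascent on the input "image".
x = 0.01 * rng.normal(size=n_pixels)
if np.dot(w, x) < 0:                       # start on the responsive side of the ReLU
    x = -x
for _ in range(200):
    x = x + 0.1 * grad_response(x)
    x = x / np.linalg.norm(x)              # keep the image at unit norm

cosine = np.dot(x, w) / np.linalg.norm(w)  # alignment with the preferred stimulus
print(f"response: {response(x):.3f}, alignment: {cosine:.3f}")
```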
Inferring mechanisms of neural circuit computation from deep CNNs
- McIntosh L, Maheswaranathan N, Nayebi A, Ganguli S, Baccus S (2016). Deep learning models of the retinal response to natural scenes. Adv. Neural Inf. Process. Syst. 29: 1369–1377. [Paper][Notes]
- Lindsey J, Ocko SA, Ganguli S, Deny S (2019). A unified theory of early visual representations from retina to cortex through anatomically constrained deep CNNs. https://doi.org/10.1101/511535. [Paper][Notes]
Recurrent networks are dynamical
- Kar K, Kubilius J, Schmidt K, Issa EB, DiCarlo JJ (2019). Evidence that recurrent circuits are critical to the ventral stream's execution of core object recognition behavior. Nat. Neurosci. 22: 974. [Paper][Notes]
- Kietzmann TC, Spoerer CJ, Sörensen LKA, Cichy RM, Hauk O, Kriegeskorte N (2019). Recurrence is required to capture the representational dynamics of the human visual system. Proc Natl Acad Sci USA 116: 21854–21863. [Paper][Notes]
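
The shared point of these two papers is that a recurrent network has internal state that unfolds over time, so a fixed stimulus evokes a trajectory of population activity rather than a single feedforward snapshot. Below is a minimal sketch of a continuous-time rate network integrated with Euler steps; the weights are random, chosen only to illustrate the dynamics, not fitted to any data.

```python
import numpy as np

rng = np.random.default_rng(3)
n_units, n_inputs = 50, 10

# Random recurrent and input weights, for illustration only (not fit to data).
W_rec = rng.normal(scale=1.0 / np.sqrt(n_units), size=(n_units, n_units))
W_in = rng.normal(scale=0.1, size=(n_units, n_inputs))

dt, tau = 0.01, 0.1                      # Euler step and neuronal time constant
stimulus = rng.normal(size=n_inputs)     # a fixed input held on for the whole trial

r = np.zeros(n_units)                    # firing rates: the network state
trajectory = []
for _ in range(300):
    # Euler step of  tau * dr/dt = -r + tanh(W_rec r + W_in u)
    r = r + (dt / tau) * (-r + np.tanh(W_rec @ r + W_in @ stimulus))
    trajectory.append(r.copy())

trajectory = np.array(trajectory)
# With a constant stimulus the population response is a trajectory, not a single
# vector: early and late representations of the same input differ.
print(np.linalg.norm(trajectory[10] - trajectory[150]))
```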
Reinforcement learning explores (2016)
- Cross L, Cockburn J, Yue Y, O'Doherty JP (2021). Using deep reinforcement learning to reveal how the brain encodes abstract state-space representations in high-dimensional environments. Neuron 109: 724–738. [Paper][Notes]
- Wang JX, Kurth-Nelson Z, Kumaran D, Tirumala D, Soyer H, Leibo JZ, et al. (2018). Prefrontal cortex as a meta-reinforcement learning system. Nat. Neurosci. 21: 860–868. [Paper][Notes]
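
Both papers use deep or meta reinforcement learning, but the learning signal underneath is the familiar temporal-difference update. The sketch below is plain tabular Q-learning on a made-up five-state chain task; the papers above replace the table with a network and the toy chain with high-dimensional environments.

```python
import numpy as np

rng = np.random.default_rng(4)

# A made-up 5-state chain: actions 0 = left, 1 = right, reward at the right end.
n_states, n_actions = 5, 2

def step(state, action):
    next_state = min(state + 1, n_states - 1) if action == 1 else max(state - 1, 0)
    reward = 1.0 if next_state == n_states - 1 else 0.0
    return next_state, reward, next_state == n_states - 1

Q = np.zeros((n_states, n_actions))
alpha, gamma, epsilon = 0.1, 0.9, 0.1

for episode in range(500):
    state = 0
    for _ in range(200):                               # cap episode length
        if rng.random() < epsilon:                     # epsilon-greedy exploration
            action = int(rng.integers(n_actions))
        else:                                          # greedy, ties broken at random
            q = Q[state]
            action = int(rng.choice(np.flatnonzero(q == q.max())))
        next_state, reward, done = step(state, action)
        # Temporal-difference (Q-learning) update
        target = reward + (0.0 if done else gamma * np.max(Q[next_state]))
        Q[state, action] += alpha * (target - Q[state, action])
        if done:
            break
        state = next_state

print(np.round(Q, 2))   # after training, "right" has the higher value in every non-terminal state
```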
Unsupervised learning is biologically plausible
- Lotter W, Kreiman G, Cox D (2017). Deep predictive coding networks for video prediction and unsupervised learning. https://arxiv.org/abs/1605.08104. [Paper][Notes]
- Zhuang C, Yan S, Nayebi A, Schrimpf M, Frank MC, DiCarlo JJ, Yamins DLK (2021). Unsupervised neural network models of the ventral visual stream. Proc Natl Acad Sci USA 118: e2014196118. [Paper][Notes]
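
The common thread here is that a network can learn useful representations without labels by predicting its own input: Lotter et al. train a network to predict the next video frame, and Zhuang et al. study related self-supervised objectives. The sketch below strips this down to a linear model trained to predict the next frame of a made-up one-dimensional "video" (a bump of activity drifting along a line), with the prediction error as the only teaching signal.

```python
import numpy as np

rng = np.random.default_rng(5)
n_pixels, n_frames = 20, 500

# Made-up "video": a bump of activity drifting one pixel per frame (with wraparound).
frames = np.zeros((n_frames, n_pixels))
for t in range(n_frames):
    frames[t, t % n_pixels] = 1.0
frames += 0.05 * rng.normal(size=frames.shape)         # a little sensor noise

# Linear predictor of the next frame from the current one.  The squared
# prediction error is the only training signal: no labels anywhere.
W = np.zeros((n_pixels, n_pixels))
lr = 2.0
for _ in range(300):
    pred = frames[:-1] @ W.T                           # predicted next frames
    err = pred - frames[1:]                            # prediction errors
    W -= lr * (err.T @ frames[:-1]) / (n_frames - 1)   # gradient step on mean squared error

# After training, W approximates a one-pixel shift: the model has discovered
# the motion in the data purely from its own prediction errors.
print(f"mean squared prediction error: {np.mean(err ** 2):.4f}")
print("pixel 5 is best predicted from pixel:", int(np.argmax(W[5])))
```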