Ask HN: Is fast.ai's "Deep Learning for Coders" still relevant in 2025?
Dear all,

I learned some basic ML from Andrew Ng's Coursera course more than ten years ago. I recently graduated from a master's program in math and have some free time on my hands, so I'm thinking about picking up ML/DL again.

In [Yacine's video](https://www.youtube.com/watch?v=ph6PIchDOcQ), he mentioned [fast.ai's course](https://course.fast.ai/), which I had heard of in the past but never looked into much. The table of contents of [the book](https://www.amazon.com/Deep-Learning-Coders-fastai-PyTorch/dp/1492045527) looks pretty solid, but it was published in 2020, so given the pace of AI development, do you think this book or course series is still a good and relevant choice for today's learners?

*To provide more context about me*: I majored in math with a CS minor (with a Python background) during undergrad but never took any ML/DL courses (other than that Coursera one). I just finished a master's program in math, and I have background and long-standing interests in graph theory, combinatorics, and theoretical computer science.

I have two books, "Hands-On Machine Learning" by Géron and "Hands-On Large Language Models" by Alammar and Grootendorst, and I plan to work through Stanford's CS224N and CS336 and CMU's Deep Learning Systems course once I have enough background knowledge. I'm interested in building and improving intelligent systems such as DeepProver and AlphaProof that can help advance math proofs and research.

Thanks a lot!