# Welcome to $\mathcal{X}$'s hidden library

πŸ“£ **Read my latest blogs:**

* [[Cracking the Annotated Transformer - Part I]]
* [[Cracking the Annotated Transformer - Part II]]

## What is this library?

Thanks for visiting my library - my name is [[About|Huang]]. I built this library purely out of personal interest in various topics, and I hope it is helpful for anyone who shares an interest in exploring the frontier of AI and its security landscape. In this library you will find many themes on machine learning, covering both classic and state-of-the-art algorithms, adversarial machine learning and other AI security topics, as well as my journal of technical writing, reading, learning and reflection.

## Explore the topics

Here are the topics you might find interesting to read:

- [[Introduction|πŸ“š The book of Adversarial Machine Learning]]
- [[The library of ML theory|πŸ€– Learn machine learning theory]]
- [[Statistical learning theory|πŸ“ˆ Introduction to statistics]]
- [[My blogs|πŸ“ Read my blogs]]

## What is this not?

This is not meant to be a tutorial on (adversarial) machine learning or statistics; if you are completely new to the area, you may find other textbooks more comprehensive. I keep a list of recommendations below; it may change over time, but the books listed are absolutely top-notch.

This is also not a Wikipedia for machine learning, although I try to connect the content with backlinks (thanks, Obsidian), which makes it much quicker to travel across different pieces of knowledge.

Last but not least, the opinions here are solely my own and do not represent any affiliation that I am or was associated with.

## Recommended books

- Hastie, T., Tibshirani, R., & Friedman, J. H. (2009). *The elements of statistical learning: data mining, inference, and prediction*, 2nd ed. New York, NY: Springer.
  > *Perhaps every statistician has read it. Absolutely classic, no doubt. If you have a CS background, you should read it in addition to PRML, since it will give you a solid foundation in statistical learning theory. That is what I did.*
- Bishop, C. M., 2006. *Pattern recognition and machine learning*, Information science and statistics. Springer, New York.
  > I consider it the bible of machine learning, even though it looks detached from what we learn nowadays - who needs to write a kernel machine for a classification problem, after all? Still, if you are doing AI research, you should understand every sentence in this book. Bishop has published a new book on deep learning; I haven't read it yet, but it is definitely worth a go.
- Wasserman, L., 2010. *All of statistics: a concise course in statistical inference*, Springer texts in statistics. Springer, New York.
  > Strictly speaking this is not a statistics textbook, but it is sufficient for readers from CS to build the statistical background needed for machine learning.
- Goodfellow, I., Bengio, Y., Courville, A., 2016. *Deep learning*, Adaptive computation and machine learning. The MIT Press, Cambridge, Massachusetts.
  > I recommend this book after you finish either ESL or PRML, as it focuses on deep learning and covers many things the other two do not. Looking at the book now, there are again new developments it doesn't cover, as this area is simply moving too fast.
- Zhang, A., Lipton, Z. C., Li, M., Smola, A. J., 2023. *Dive into deep learning*. Cambridge University Press, Cambridge, United Kingdom; New York, NY. Check out the [online version](https://d2l.ai/).
  > A relatively new choice with an online version, which is better for notebooks, code samples and illustrations. Online publishing makes much more sense for this area nowadays: it allows much faster iteration on new content, new algorithms and so on.