<h1 align="center">Deep Learning Interviews book: Hundreds of fully solved job interview questions from a wide range of key topics in AI.</h1>
<p align="center">
<a href="#download">Download PDF</a>
<a href="#about">About</a>
<a href="#about">Errata</a>
</p>
<h1 align="center">
<img src="https://github.com/BoltzmannEntropy/interviews.ai/blob/main/assets/cover-amazon-print2.png" width="100%"></a>
</h1>
# A PERSONAL NOTE:
"Keep learning, or risk becoming irrelevant."
In this first volume, I purposely present a coherent, cumulative, and content-specific core curriculum of the data science field, including topics
such as information theory, Bayesian statistics, algorithmic differentiation, logistic regression, perceptrons, and convolutional neural networks.
I hope you will find this book stimulating.
It is my belief that you, the postgraduate students and job-seekers for whom the book is primarily meant, will benefit from reading it; I hope, however, that even the most experienced researchers will find it fascinating as well.
## **I would like to solicit corrections, criticisms, and suggestions from students and other readers. Although I have tried to eliminate errors over the multi-year process of writing and revising this text, a few undoubtedly remain. In particular, some typographical infelicities will no doubt find their way into the final version. I hope you will forgive them.**
**Contact Amir:**
* https://www.linkedin.com/in/amirivry/
* https://scholar.google.com.mx/citations?user=rQCVwksAAAAJ&hl=iw
**Contact Shlomo:**
* https://www.linkedin.com/in/quantscientist/
* https://scholar.google.com.mx/citations?user=bM0LGgcAAAAJ&hl
This book is available for purchase through Amazon and other standard distribution channels. Please see the publisher's web page to order the book or to obtain further details on its publication. A manuscript of the book can be found below—it has been made available for personal use only and must not be sold.
* https://amazon.com/author/quantscientist
---
# Download
### The PDF is available here:
https://arxiv.org/abs/2201.00650
## Citation
```
@misc{kashani2021deep,
      title={Deep Learning Interviews: Hundreds of fully solved job interview questions from a wide range of key topics in AI},
      author={Shlomo Kashani and Amir Ivry},
      year={2021},
      eprint={2201.00650},
      note={ISBN 13: 978-1-9162435-4-5},
      url={https://www.interviews.ai},
      archivePrefix={arXiv},
      primaryClass={cs.LG}
}
```
**SELLING OR COMMERCIAL USE IS STRICTLY PROHIBITED**.
The user rights of this e-resource are specified in a licence agreement below.
You may only use this e-resource for the purposes of *private study*.
Any selling/reselling of its content is strictly prohibited.
This book (www.interviews.ai) was written for you: an aspiring data scientist with a quantitative background, facing down the gauntlet of the interview process in an increasingly competitive field. For most of you, the interview process is the most significant hurdle between you and a dream job.
Even though you have the ability, the background, and the motivation to excel in your target position, you might need some guidance on how to get your foot in the door.
## About
The second edition of Deep Learning Interviews (the Amazon softcover is printed in B&W) is home to hundreds of fully-solved problems from a wide range of key topics in AI. It is designed both to rehearse interview- or exam-specific topics and to provide machine learning M.Sc./Ph.D. students, and those awaiting an interview, with a well-organized overview of the field. The problems it poses are tough enough to cut your teeth on and to dramatically improve your skills, but they're framed within thought-provoking questions and engaging stories.
That is what makes the volume so valuable to students and job seekers: it provides them with the ability to speak confidently and quickly on any relevant topic, to answer technical questions clearly and correctly, and to fully understand the purpose and meaning of interview questions and answers. Those are powerful, indispensable advantages to have when walking into the interview room.
The book's contents are a large inventory of numerous topics relevant to DL job interviews and graduate-level exams. That places this work at the forefront of the growing trend in science to teach a core set of practical mathematical and computational skills. It is widely accepted that the training of every computer scientist must include the fundamental theorems of ML, and AI appears in the curriculum of nearly every university. This volume is designed as an excellent reference for graduates of such programs.
- The book spans almost 400 pages
- Hundreds of fully-solved problems
- Problems from numerous areas of deep learning
- Clear diagrams and illustrations
- A comprehensive index
- Step-by-step solutions to problems
- Not just the answers given, but the work shown
- Not just the work shown, but reasoning given where appropriate
Your curiosity will pull you through the book's problem sets, formulas, and instructions, and as you progress, you'll deepen your understanding of deep learning. There are intricate connections between calculus, logistic regression, entropy, and deep learning theory; work through the book, and those connections will feel intuitive.
## CORE SUBJECT AREAS (VOLUME-I):
VOLUME-I of the book focuses on statistical perspectives and blends background fundamentals with core ideas and practical knowledge. There are dedicated chapters on:
- Information Theory
- Calculus & Algorithmic Differentiation
- Bayesian Deep Learning & Probabilistic Programming
- Logistic Regression
- Ensemble Learning
- Feature Extraction
- Deep Learning: expanded chapter (100+ pages)
These chapters appear alongside numerous in-depth treatments of topics in Deep Learning with code examples in PyTorch, Python and C++.
## Disclaimers
- "PyTorch" is a trademark of Facebook.
## Licensing
- Copyright © [Shlomo Kashani, author of the book "Deep Learning Interviews"](https://www.interviews.ai)

Shlomo Kashani, author of the book _Deep Learning Interviews_ (www.interviews.ai): entropy@interviews.ai
<h1 align="center">
<img src="https://github.com/BoltzmannEntropy/interviews.ai/blob/main/assets/droput2-ans.png" width="50%"></a>
</h1>
# Errata (May not be up to date)
## ***Minor corrections are not included.***
Thank you to all the readers who pointed out these issues.
**Errata for the 03/12/2020 printing, reflected in the online version:**
1. Question number **PRB-267 -CH.PRB- 8.91** was removed due to lack of clarity
2. Question number **PRB-115 - CH.PRB- 5.16** was removed due to lack of clarity
**Errata for the 05/12/2020 printing, reflected in the online version:**
1. Page 230, Question number **PRB-178** amend “startified scross validation” to “stratified cross validation.”
2. Page 231, Question number **PRB-181** added a “.” after data-folds
3. Page 231, Question number **PRB-191** amend “an” to “a”
4. Page 234, Question number **PRB-192** “in” repeated twice
5. Page 236, Question number **PRB-194** amend “approached” to “approaches”, “arr” to “arr001”
6. Page 247, Question number **PRB-210** amend “an” to “a”
7. Page 258, Question number **PRB-227** amend “A confusion metrics” to “A confusion matrix”
8. Page 271, Question number **PRB-240** amend “MaxPool2D(4,4,)” to “MaxPool2D(4,4)”
9. Page 273, Question number **PRB-243** amend “identity” to “identify”
10. Page 281, Question number **PRB-254** amend “suggest” to “suggests”
11. Page 283, Question number **PRB-256** “happening” misspelled
12. Page 286, “L1, L2” amended to “Norms”
13. Page 288, Question number **SOL-184** amend “the full” to “is the full”
14. Page 298, Question number **SOL-208** amend “ou1” to “out”
15. Page 319, Question number **SOL-240** amend “torch.Size([1, 32, 222, 222]).” to “torch.size([1, 32, 222, 222]).”
**Errata for the 07/12/2020 printing, reflected in the online version:**
1. Page 187, Question number **PRB-140**: two missing plots (6.3 and 6.4) did not render correctly in the print version
![ball001.png](https://images.squarespace-cdn.com/content/v1/5c33c435f93fd4233f157b43/1607530038262-PGU5F0YDMFA9NON3NKZA/ball001.png?format=500w)
Figure 6.3
![ball002.png](https://images.squarespace-cdn.com/content/v1/5c33c435f93fd4233f157b43/1607530124438-1U0OIE7QPO0DSMP8LKBS/ball002.png?format=500w)
Figure 6.4
**Errata for the 09/21/2020 printing, reflected in the online version:**
1. Page 34, Solution number **SOL-19**: 0.21886 should be 0.21305, and 0.21886 ± 1.95 × 0.21886 should be **0.21305** ± 1.95 × 0.21886
2. Pages 36-37, Solution number **SOL-21**: 4.8792/0.0258 = **189.116** and not 57.3, and pi(33) = 0.01748 and not pi(33) = **0.211868**.
3. Page 49, **PRB-47**: “What is the probability that the expert is a **monkey**” should be “What is the probability that the expert is a **human**”
**Errata for the 09/22/2020 printing, reflected in the online version:**
1. Page 73, Solution number **SOL-56** should read “The Hessian is generated by **differentiating**”
2. Page 57, Problem number **PRB-65** should read “**two** neurons”
**Errata for the 09/24/2020 printing, reflected in the online version:**
1. Page 78, Solution number **SOL-64**: the OnOffLayer is off only if at least 150 out of 200 neurons are off. Therefore, this may be represented as a Binomial distribution, and the probability for the layer to be off is as follows (a short numerical sketch appears after the formula image):
![2020-12-24 21_08_52-E__Sync_branded_interviews.ai_amazon_21-12-2020_chap_bayes.tex - TeXstudio.png](https://images.squarespace-cdn.com/content/v1/5c33c435f93fd4233f157b43/1608836963860-U8ZD3L5L4IL5QKOAUZD8/2020-12-24+21_08_52-E__Sync_branded_interviews.ai_amazon_21-12-2020_chap_bayes.tex+-+TeXstudio.png?format=750w)
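For readers who want to sanity-check the binomial reasoning above, here is a minimal Python sketch. The per-neuron "off" probability `p` below is a placeholder assumption (the value actually used in SOL-64 appears in the formula image above), so the printed number is illustrative only.

```python
from math import comb

def prob_layer_off(n: int = 200, threshold: int = 150, p: float = 0.5) -> float:
    """P(at least `threshold` of `n` independent neurons are off),
    modelled as the upper tail of a Binomial(n, p) distribution."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(threshold, n + 1))

if __name__ == "__main__":
    # p = 0.5 is only a placeholder; substitute the per-neuron probability used in SOL-64.
    print(f"P(OnOffLayer is off) = {prob_layer_off(p=0.5):.6g}")
```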