Lead Teaching Assistant — Deep Learning (Spring 2019)
This course was offered for the first time in Spring 2019 by Professors Keith M. Chugg and Brandon Franzke (see syllabus). As lead TA, my duties were:
- Delivering lectures on deep learning techniques (slides)
- Designing handouts on Basic Operations of Neural Networks and Matrix Calculus
- Mentoring several groups of students on their final projects
- Introducing software tools such as Python deep learning frameworks and Amazon Web Services
- Designing assignments
- Conducting discussion sessions and holding office hours to clarify tricky concepts
More Teaching and Mentoring Experiences
- Teaching Assistant — MOS VLSI Circuit Design (Fall 2016 – Spring 2018): Duties similar to those for the Deep Learning course above.
- Graduate Student Mentor — USC Young Researchers Program (Summer 2016): This is a program in which local rising high school seniors from traditionally underrepresented minorities work on science and engineering projects. I mentored one such student, taught him to code in Python, and together we developed a program that plays the board game Scrabble. See poster.
- See my CV for more mentorship experiences.
Since May 2018 I have been working on a document titled Essential Concepts of Deep Learning. It is a handbook of sorts that explains the math, concepts, tricks, and techniques useful for deep learning, such as probability, linear algebra, and optimization. It is compiled from various books, online sources, courses at USC, and, of course, my own experience working in this field since 2016. At this point I am not entirely sure what the final state of this document will be, but I see several options: a) keep it as an informal ready reckoner on deep learning, b) use it as a handbook for teaching, or c) add my own research to it and publish it as a full-fledged book. Right now it exists as the first option, in the form of an Overleaf project. Any suggestions for improvement and prospective collaborations are appreciated.