In this talk I will give an introduction to modern techniques in learning theory. I will discuss what makes a Boolean function easy or hard to learn (in general, stable functions are easy to learn, while sensitive functions are "morally" hard to learn). I will describe my view of the 3.5 coolest techniques used today in learning Boolean functions:

1. the low-degree algorithm;
2. the sparse algorithm;
3. boosting;
3.5. using influences to learn monotone functions.

The lecture is intended as an introduction to doing research in the field. Most of the learning results I will describe are based on the discrete Fourier transform (i.e., writing a function as a sum of parities), so I will start the talk by defining and discussing the discrete Fourier transform. Part of the goal of this talk is to give background for Zhiqiang Zhang's talk on Friday.

No background in learning theory is needed to understand this talk. I encourage everyone to come, since this is an interesting topic, and if there is enough interest, we can create a working group on these topics.
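To make "writing a function as a sum of parities" concrete, here is a minimal brute-force sketch, assuming the standard convention that inputs and outputs live in {-1, 1} and that the coefficient of the parity on a set S is the correlation of f with that parity; the function names (`fourier_coefficients`, `maj3`) are illustrative, not from the talk.

```python
from itertools import combinations, product

def fourier_coefficients(f, n):
    """Return {S: hat_f(S)} for a function f on {-1,1}^n.

    hat_f(S) = E_x[f(x) * chi_S(x)], where chi_S(x) = prod_{i in S} x_i
    is the parity on the coordinates in S. Then f(x) = sum_S hat_f(S) chi_S(x).
    Brute force over all 2^n points, so only sensible for small n.
    """
    points = list(product([-1, 1], repeat=n))
    coeffs = {}
    for k in range(n + 1):
        for S in combinations(range(n), k):
            total = 0
            for x in points:
                chi = 1
                for i in S:
                    chi *= x[i]  # parity of the bits in S
                total += f(x) * chi
            coeffs[S] = total / len(points)  # average over all inputs
    return coeffs

# Example: majority on 3 bits, Maj3(x) = sign(x1 + x2 + x3).
maj3 = lambda x: 1 if sum(x) > 0 else -1
coeffs = fourier_coefficients(maj3, 3)
```

For Maj3 this recovers the expansion Maj3(x) = (1/2)(x1 + x2 + x3) - (1/2) x1 x2 x3: each singleton coefficient is 1/2 and the full-set coefficient is -1/2, with all squared coefficients summing to 1 (Parseval), since Maj3 is ±1-valued.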