Machine learning is not magic: the quality of a model's predictions is a direct reflection of the data you feed it during training. In this course, instructor Matt Harrison guides you through the nuances of feature engineering for numeric data so you can take a dataset, tease out the signal, and throw out the noise to optimize your machine learning model. Matt teaches techniques like imputation, binning, log transformations, and scaling for numeric data. He also covers methods for other types of data, such as one-hot encoding, target mean encoding, principal component analysis, feature aggregation, and text processing techniques like TF-IDF and embeddings. The tools you learn in this course generalize to nearly any kind of machine learning algorithm or problem, so join Matt to learn how to extract the maximum value from your data using feature engineering.
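To give a flavor of what these numeric techniques look like in practice, here is a minimal stdlib-only sketch of three of them (imputation, a log transformation, and scaling) applied to a toy column. The data and variable names are illustrative assumptions, not material from the course:

```python
import math
import statistics

# Toy numeric column with one missing value (None); values are made up.
prices = [10.0, 12.0, None, 100.0, 11.0]

# 1. Imputation: replace missing values with the median of the observed ones.
observed = [x for x in prices if x is not None]
median = statistics.median(observed)
imputed = [x if x is not None else median for x in prices]

# 2. Log transformation: compress a right-skewed range with log(1 + x).
logged = [math.log1p(x) for x in imputed]

# 3. Scaling: standardize to zero mean and unit variance.
mu = statistics.mean(logged)
sigma = statistics.pstdev(logged)
scaled = [(x - mu) / sigma for x in logged]

print(imputed)
print([round(x, 3) for x in scaled])
```

In a real project you would typically reach for library equivalents (for example, scikit-learn's `SimpleImputer` and `StandardScaler`), but the underlying arithmetic is exactly what is shown above.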