======Meeting Minutes: September 21, 2020======

Page: forecasting:meeting_minutes_september_21_2020 (created 2020/09/21 by kmacloves; last updated 2021/09/19)

=====Updates=====
  * Started Section 9 of the course: Support Vector Regression (SVR).
  * Topics learned: SVR and Support Vector Machines.
=====Problems=====
  * COVID-19 forced today's lab hours to be online.
=====Reminders=====
  * Support Vector Regression (SVR) uses a //hyperplane// to stratify the data and features.
  * The hyperplane is produced by a regression model and is paired with nearby data points (the //support vectors//), which together define a margin.
  * Points that fall within the margin are ignored, much like noise, so that the model fits only the data that accurately represents the features.
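The margin idea above can be sketched with scikit-learn's `SVR` class. This is a minimal illustration under assumed settings, not course material: the data is made up, and the `epsilon` and `kernel` values are arbitrary choices.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler

# Toy 1-D data (invented for illustration): a noisy sine curve
rng = np.random.default_rng(0)
X = np.linspace(0, 6, 50).reshape(-1, 1)
y = np.sin(X).ravel() + rng.normal(scale=0.1, size=50)

# SVR is sensitive to feature scale, so standardize the inputs first
scaler = StandardScaler()
X_scaled = scaler.fit_transform(X)

# epsilon sets the half-width of the margin ("tube") around the fitted
# curve; training points that fall inside it contribute no loss
model = SVR(kernel="rbf", epsilon=0.1)
model.fit(X_scaled, y)

# Points on or outside the margin become the support vectors
print(f"{len(model.support_)} of {len(X)} points are support vectors")
pred = model.predict(scaler.transform([[3.0]]))
```

Widening `epsilon` makes the margin swallow more points, leaving fewer support vectors and a smoother fit.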