Homework 4 Solutions, Machine Learning Course.pdf (12 pages)
CS229 Problem Set #4 Solutions — 1

CS 229, Public Course
Problem Set #4 Solutions: Unsupervised Learning and Reinforcement Learning

1. EM for supervised learning

In class we applied EM to the unsupervised learning setting. In particular, we represented $p(x)$ by marginalizing over a latent random variable

$$p(x) = \sum_z p(x, z) = \sum_z p(x \mid z)\,p(z).$$

However, EM can also be applied to the supervised learning setting, and in this problem we discuss a "mixture of linear regressors" model; this is an instance of what is often called the Hierarchical Mixture of Experts model. We want to represent $p(y \mid x)$, $x \in \mathbb{R}^n$ and $y \in \mathbb{R}$, and we do so by again introducing a discrete latent random variable

$$p(y \mid x) = \sum_z p(y, z \mid x) = \sum_z p(y \mid x, z)\,p(z \mid x).$$

For simplicity we'll assume that $z$ is binary valued, that $p(y \mid x, z)$ is a Gaussian density, and that $p(z \mid x)$ is given by a logistic regression model. More formally

$$p(z \mid x; \phi) = g(\phi^T x)^z \,(1 - g(\phi^T x))^{1-z}$$

$$p(y \mid x, z = i; \theta_i) = \frac{1}{\sqrt{2\pi}\,\sigma} \exp\!\left(\frac{-(y - \theta_i^T x)^2}{2\sigma^2}\right) \qquad i = 1, 2$$

where $\sigma$ is a known parameter and $\phi, \theta_0, \theta_1 \in \mathbb{R}^n$ are parameters of the model (here we
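As a concrete illustration of the model above, the marginal $p(y \mid x) = \sum_z p(y \mid x, z)\,p(z \mid x)$ can be evaluated directly: a logistic gate picks between two Gaussian regression components. The sketch below is not part of the problem set; it assumes the $z \in \{0, 1\}$ convention matching the parameters $\theta_0, \theta_1$, with the gate $g(\phi^T x)$ giving $p(z = 1 \mid x)$ as in the logistic model above. All function names are illustrative.

```python
import numpy as np

def sigmoid(a):
    """Logistic function g(a) = 1 / (1 + e^{-a})."""
    return 1.0 / (1.0 + np.exp(-a))

def p_y_given_x(y, x, phi, theta0, theta1, sigma):
    """Mixture-of-linear-regressors density:
    p(y|x) = p(z=0|x) N(y; theta0^T x, sigma^2) + p(z=1|x) N(y; theta1^T x, sigma^2)."""
    pz1 = sigmoid(phi @ x)      # logistic gate p(z=1|x; phi)
    pz0 = 1.0 - pz1

    def gauss(mean):
        # Gaussian density with the known, shared variance sigma^2
        return np.exp(-(y - mean) ** 2 / (2 * sigma ** 2)) / (np.sqrt(2 * np.pi) * sigma)

    return pz0 * gauss(theta0 @ x) + pz1 * gauss(theta1 @ x)
```

When the gate saturates ($\phi^T x$ large), the mixture collapses to a single linear-regression component, which is the regime each expert is meant to own.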