Kaggle | Courses | XGBoost

In this tutorial, you will learn how to build and optimize models with gradient boosting. This method dominates many Kaggle competitions and achieves state-of-the-art results on a variety of datasets.

Introduction

For much of this course, you have made predictions with the random forest method, which achieves better performance than a single decision tree simply by averaging the predictions of many decision trees.

We refer to the random forest method as an "ensemble method". By definition, ensemble methods combine the predictions of several models (e.g., several trees, in the case of random forests).

Next, we'll learn about another ensemble method called gradient boosting.

Gradient Boosting

Gradient boosting is a method that goes through cycles to iteratively add models into an ensemble.

It begins by initializing the ensemble with a single model, whose predictions can be pretty naive. (Even if its predictions are wildly inaccurate, subsequent additions to the ensemble will address those errors.)

Then, we start the cycle:

  • First, we use the current ensemble to generate predictions for each observation in the dataset. To make a prediction, we add the predictions from all models in the ensemble.
  • These predictions are used to calculate a loss function (like mean squared error, for instance).
  • Then, we use the loss function to fit a new model that will be added to the ensemble. Specifically, we determine model parameters so that adding this new model to the ensemble will reduce the loss. (Side note: The "gradient" in "gradient boosting" refers to the fact that we'll use gradient descent on the loss function to determine the parameters in this new model.)
  • Finally, we add the new model to the ensemble, and ...
  • ... repeat!
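The cycle above can be sketched in a few lines. This is a toy illustration, not how XGBoost is implemented: it uses squared error, for which the negative gradient of the loss with respect to the current predictions is simply the residual, so each new model is fit to the residuals of the ensemble so far. The dataset, learning rate, and tree depth here are made up for the example.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

# Toy dataset (made up for this sketch).
rng = np.random.RandomState(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X).ravel() + rng.normal(scale=0.1, size=200)

n_rounds = 10
learning_rate = 0.5

# Initialize the ensemble with a single naive model: predict the mean.
prediction = np.full_like(y, y.mean())
models = []

for _ in range(n_rounds):
    # 1. The current ensemble's predictions are in `prediction`.
    # 2. For squared-error loss, the direction that reduces the loss
    #    is the residual (the negative gradient of the loss).
    residuals = y - prediction
    # 3. Fit a new model to move the ensemble in that direction.
    tree = DecisionTreeRegressor(max_depth=2).fit(X, residuals)
    # 4. Add the new model to the ensemble ...
    models.append(tree)
    prediction += learning_rate * tree.predict(X)
    # 5. ... and repeat.

print("final MSE:", np.mean((y - prediction) ** 2))
```

Each round shrinks the training loss, which is the whole point of the cycle: every added model corrects some of the remaining error of the ensemble before it.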

Example

We begin by loading the training and validation data in X_train, X_valid, y_train, and y_valid.
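The loading code itself is not shown here; a minimal sketch of this step, using a tiny stand-in DataFrame so it runs on its own (in the course, the data would come from a CSV via pd.read_csv, and the column names below are hypothetical):

```python
import pandas as pd
from sklearn.model_selection import train_test_split

# Stand-in for data loaded from a file; column names are hypothetical.
data = pd.DataFrame({
    "Rooms":    [2, 3, 4, 3, 5, 2, 4, 3],
    "Distance": [2.5, 4.0, 3.1, 7.2, 1.8, 6.5, 2.2, 5.0],
    "Price":    [300, 410, 520, 380, 650, 280, 500, 360],
})

# Separate the target from the features.
y = data["Price"]
X = data.drop(["Price"], axis=1)

# Split off a validation set, as in earlier lessons.
X_train, X_valid, y_train, y_valid = train_test_split(
    X, y, train_size=0.75, random_state=0
)
```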


In this example, you'll work with the XGBoost library. XGBoost stands for extreme gradient boosting, which is an implementation of gradient boosting with several additional features focused on performance and speed. (Scikit-learn has another version of gradient boosting, but XGBoost has some technical advantages.)

In the next code cell, we import the scikit-learn API for XGBoost (xgboost.XGBRegressor). This allows us to build and fit a model just as we would in scikit-learn. As you'll see in the output, the XGBRegressor class has many tunable parameters -- you'll learn about those soon!

```python
from xgboost import XGBRegressor

my_model = XGBRegressor()
my_model.fit(X_train, y_train)
```

We also make predictions and evaluate the model.
