The org.apache.spark.ml.feature package contains four different feature-scaling transformers:
- Normalizer
- StandardScaler
- MinMaxScaler
- MaxAbsScaler
They are easy to confuse, so this post summarizes them with help from the official documentation and transformations of actual data.
Original article: http://www.neilron.xyz/spark-ml-feature-scaler/ Please credit the source when reposting.
0 Data Preparation
import org.apache.spark.ml.linalg.Vectors
val dataFrame = spark.createDataFrame(Seq(
(0, Vectors.dense(1.0, 0.5, -1.0)),
(1, Vectors.dense(2.0, 1.0, 1.0)),
(2, Vectors.dense(4.0, 10.0, 2.0))
)).toDF("id", "features")
dataFrame.show
// Original data
+---+--------------+
| id| features|
+---+--------------+
| 0|[1.0,0.5,-1.0]|
| 1| [2.0,1.0,1.0]|
| 2|[4.0,10.0,2.0]|
+---+--------------+
1 Normalizer
Normalizer的作用范圍是每一行,使每一個(gè)行向量的范數(shù)變換為一個(gè)單位范數(shù)耸棒,下面的示例代碼都來(lái)自spark官方文檔加上少量改寫和注釋。
import org.apache.spark.ml.feature.Normalizer
// Normalize each row vector to unit L^1 norm
val normalizer = new Normalizer()
.setInputCol("features")
.setOutputCol("normFeatures")
.setP(1.0)
val l1NormData = normalizer.transform(dataFrame)
println("Normalized using L^1 norm")
l1NormData.show()
// Each row is rescaled so that its L^1 norm is 1; the L^1 norm is the sum of the absolute values.
+---+--------------+------------------+
| id| features| normFeatures|
+---+--------------+------------------+
| 0|[1.0,0.5,-1.0]| [0.4,0.2,-0.4]|
| 1| [2.0,1.0,1.0]| [0.5,0.25,0.25]|
| 2|[4.0,10.0,2.0]|[0.25,0.625,0.125]|
+---+--------------+------------------+
// Normalize each row vector to unit L^inf norm
val lInfNormData = normalizer.transform(dataFrame, normalizer.p -> Double.PositiveInfinity)
println("Normalized using L^inf norm")
lInfNormData.show()
// The L^inf norm of a vector is the maximum of the absolute values of its elements
+---+--------------+--------------+
| id| features| normFeatures|
+---+--------------+--------------+
| 0|[1.0,0.5,-1.0]|[1.0,0.5,-1.0]|
| 1| [2.0,1.0,1.0]| [1.0,0.5,0.5]|
| 2|[4.0,10.0,2.0]| [0.4,1.0,0.2]|
+---+--------------+--------------+
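The row-wise math above can be sketched in plain Python (this is only an illustration of the formula, not the Spark API; the function name is made up):

```python
# Plain-Python sketch of the per-row math Normalizer performs
# (illustrative only; not the Spark API).
def normalize(row, p):
    if p == float("inf"):
        norm = max(abs(x) for x in row)  # L^inf norm: largest absolute value
    else:
        norm = sum(abs(x) ** p for x in row) ** (1.0 / p)
    return [x / norm for x in row]

rows = [[1.0, 0.5, -1.0], [2.0, 1.0, 1.0], [4.0, 10.0, 2.0]]
for r in rows:
    print(normalize(r, 1.0), normalize(r, float("inf")))
```

Running this reproduces the two tables above, e.g. [4.0, 10.0, 2.0] becomes [0.25, 0.625, 0.125] under the L^1 norm and [0.4, 1.0, 0.2] under the L^inf norm.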
2 StandardScaler
StandardScaler operates on each column, i.e. each feature dimension, standardizing features to unit standard deviation, zero mean, or both.
It has two main parameters:
- withStd: true by default. Scales the data to unit standard deviation.
- withMean: false by default. Whether to center the data to zero mean.
StandardScaler must be fit to the data to obtain each column's mean and standard deviation, which are then used to scale each feature.
import org.apache.spark.ml.feature.StandardScaler
val scaler = new StandardScaler()
.setInputCol("features")
.setOutputCol("scaledFeatures")
.setWithStd(true)
.setWithMean(false)
// Compute summary statistics by fitting the StandardScaler.
val scalerModel = scaler.fit(dataFrame)
// Normalize each feature to have unit standard deviation.
val scaledData = scalerModel.transform(dataFrame)
scaledData.show
// Each column is scaled to a standard deviation of 1.
+---+--------------+------------------------------------------------------------+
|id |features |scaledFeatures |
+---+--------------+------------------------------------------------------------+
|0 |[1.0,0.5,-1.0]|[0.6546536707079772,0.09352195295828244,-0.6546536707079771]|
|1 |[2.0,1.0,1.0] |[1.3093073414159544,0.1870439059165649,0.6546536707079771] |
|2 |[4.0,10.0,2.0]|[2.618614682831909,1.870439059165649,1.3093073414159542] |
+---+--------------+------------------------------------------------------------+
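To see where the numbers in the table come from, here is a plain-Python sketch of the column-wise math for withStd=true, withMean=false (not the Spark API). Note that Spark uses the corrected sample standard deviation (divisor n - 1), which is what reproduces the values above:

```python
import math

# Sketch of StandardScaler with withStd=true, withMean=false (not the Spark API).
# Spark uses the corrected sample standard deviation (divisor n - 1).
def std_scale(rows):
    n, dim = len(rows), len(rows[0])
    stds = []
    for j in range(dim):
        col = [r[j] for r in rows]
        mean = sum(col) / n
        var = sum((x - mean) ** 2 for x in col) / (n - 1)  # sample variance
        stds.append(math.sqrt(var))
    # withMean=false: divide by the std without centering first
    return [[x / s for x, s in zip(r, stds)] for r in rows]

rows = [[1.0, 0.5, -1.0], [2.0, 1.0, 1.0], [4.0, 10.0, 2.0]]
print(std_scale(rows))
```

The first entry, 1.0 divided by the sample std of (1, 2, 4), comes out to roughly 0.6546537, matching the table.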
3 MinMaxScaler
MinMaxScaler likewise operates on each column, i.e. each feature dimension, linearly mapping each feature into a specified range, typically [0, 1].
It also has two parameters:
- min: 0.0 by default. The lower bound of the target range.
- max: 1.0 by default. The upper bound of the target range.
import org.apache.spark.ml.feature.MinMaxScaler
val scaler = new MinMaxScaler()
.setInputCol("features")
.setOutputCol("scaledFeatures")
// Compute summary statistics and generate MinMaxScalerModel
val scalerModel = scaler.fit(dataFrame)
// rescale each feature to range [min, max].
val scaledData = scalerModel.transform(dataFrame)
println(s"Features scaled to range: [${scaler.getMin}, ${scaler.getMax}]")
scaledData.select("features", "scaledFeatures").show
// 每維特征線性地映射呢灶,最小值映射到0,最大值映射到1钉嘹。
+--------------+-----------------------------------------------------------+
|features |scaledFeatures |
+--------------+-----------------------------------------------------------+
|[1.0,0.5,-1.0]|[0.0,0.0,0.0] |
|[2.0,1.0,1.0] |[0.3333333333333333,0.05263157894736842,0.6666666666666666]|
|[4.0,10.0,2.0]|[1.0,1.0,1.0] |
+--------------+-----------------------------------------------------------+
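The per-column formula can be sketched in plain Python (not the Spark API; the constant-column fallback to the midpoint of the range follows the behavior described in the Spark docs, and is an edge case the sample data does not exercise):

```python
# Sketch of the per-column MinMaxScaler formula (not the Spark API):
#   scaled = (x - col_min) / (col_max - col_min) * (max - min) + min
def min_max_scale(rows, lo=0.0, hi=1.0):
    dim = len(rows[0])
    mins = [min(r[j] for r in rows) for j in range(dim)]
    maxs = [max(r[j] for r in rows) for j in range(dim)]
    out = []
    for r in rows:
        out.append([(hi + lo) / 2 if mx == mn  # constant column: midpoint of range
                    else (x - mn) / (mx - mn) * (hi - lo) + lo
                    for x, mn, mx in zip(r, mins, maxs)])
    return out

rows = [[1.0, 0.5, -1.0], [2.0, 1.0, 1.0], [4.0, 10.0, 2.0]]
print(min_max_scale(rows))
```

With the default [0, 1] range this reproduces the table: the row of minima maps to all zeros and the row of maxima to all ones.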
4 MaxAbsScaler
MaxAbsScaler maps each feature dimension into the closed interval [-1, 1] by dividing by that dimension's maximum absolute value. It does not shift or center the distribution, so it preserves the sparsity of the original feature vectors.
import org.apache.spark.ml.feature.MaxAbsScaler
val scaler = new MaxAbsScaler()
.setInputCol("features")
.setOutputCol("scaledFeatures")
// Compute summary statistics and generate MaxAbsScalerModel
val scalerModel = scaler.fit(dataFrame)
// rescale each feature to range [-1, 1]
val scaledData = scalerModel.transform(dataFrame)
scaledData.select("features", "scaledFeatures").show()
// The per-column maximum absolute values are [4, 10, 2]
+--------------+----------------+
| features| scaledFeatures|
+--------------+----------------+
|[1.0,0.5,-1.0]|[0.25,0.05,-0.5]|
| [2.0,1.0,1.0]| [0.5,0.1,0.5]|
|[4.0,10.0,2.0]| [1.0,1.0,1.0]|
+--------------+----------------+
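The same math in plain Python (illustrative only, not the Spark API):

```python
# Sketch of the per-column MaxAbsScaler math (not the Spark API):
# divide each column by its maximum absolute value.
def max_abs_scale(rows):
    dim = len(rows[0])
    max_abs = [max(abs(r[j]) for r in rows) for j in range(dim)]
    return [[x / m for x, m in zip(r, max_abs)] for r in rows]

rows = [[1.0, 0.5, -1.0], [2.0, 1.0, 1.0], [4.0, 10.0, 2.0]]
print(max_abs_scale(rows))
```

Dividing the first row by [4, 10, 2] gives [0.25, 0.05, -0.5], matching the table; note the sign of -1.0 is preserved, since only the magnitude is rescaled.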
Summary
All four scaling methods are linear transformations; when a feature dimension has a nonlinear distribution, other feature-preprocessing methods are needed as well.