1. First, create the data. It has four columns: shop id (shop_id), order id (order_id), order amount (amount), and order date (event_day).
spark.createDataFrame(Seq(
("1","11",10,"2023-01-01"),
("1","22",20,"2023-01-02"),
("1","33",10,"2023-02-28"),
("1","44",30,"2023-03-02"),
("1","55",10,"2023-05-02"),
("1","55",20,"2023-06-02"),
("1","11",10,"2022-01-01"),
("1","22",20,"2022-01-02"),
("1","33",10,"2022-02-28"),
("1","44",30,"2022-03-02"),
("1","55",10,"2022-05-02"),
("1","55",20,"2022-06-02"),
("11","11",10,"2023-01-01"),
("11","22",30,"2023-01-02"),
("11","33",10,"2023-02-28"),
("11","44",20,"2023-03-02"),
("11","55",10,"2023-05-02"),
("11","55",30,"2023-06-02"),
("11","11",10,"2022-01-01"),
("11","22",20,"2022-01-02"),
("11","33",10,"2022-02-28"),
("11","44",30,"2022-03-02"),
("11","55",20,"2022-05-02"),
("11","55",30,"2022-06-02")
)).toDF("shop_id","order_id","amount","event_day").createOrReplaceTempView("t1")
The data looks like this (show() prints only the first 20 of the 24 rows):
+-------+--------+------+----------+
|shop_id|order_id|amount| event_day|
+-------+--------+------+----------+
| 1| 11| 10|2023-01-01|
| 1| 22| 20|2023-01-02|
| 1| 33| 10|2023-02-28|
| 1| 44| 30|2023-03-02|
| 1| 55| 10|2023-05-02|
| 1| 55| 20|2023-06-02|
| 1| 11| 10|2022-01-01|
| 1| 22| 20|2022-01-02|
| 1| 33| 10|2022-02-28|
| 1| 44| 30|2022-03-02|
| 1| 55| 10|2022-05-02|
| 1| 55| 20|2022-06-02|
| 11| 11| 10|2023-01-01|
| 11| 22| 30|2023-01-02|
| 11| 33| 10|2023-02-28|
| 11| 44| 20|2023-03-02|
| 11| 55| 10|2023-05-02|
| 11| 55| 30|2023-06-02|
| 11| 11| 10|2022-01-01|
| 11| 22| 20|2022-01-02|
+-------+--------+------+----------+
2. Compute each month's share of yearly sales
This is done with a window function: first aggregate sales by shop and month, then sum the monthly figures into a yearly total with a window, and finally divide to get the ratio.
spark.sql(
s"""
|select
|shop_id,
|month,
|year,
|m_amount,
|y_amount,
|round(m_amount/y_amount,4) ratio
|from
|(
|select
|shop_id,
|month,
|m_amount,
|date_format(month,'yyyy') year,
|sum(m_amount) over(partition by shop_id,date_format(month,'yyyy')) y_amount
|from
|(
|select
|shop_id,
|date_format(event_day,'yyyy-MM') month,
|sum(amount) m_amount
|from t1 group by shop_id,date_format(event_day,'yyyy-MM')
|) a) aa order by shop_id,month
|""".stripMargin).show()
Result (the window partitions by shop_id as well as year, so y_amount is each shop's own yearly total; partitioning by year alone would mix all shops together):
+-------+-------+----+--------+--------+------+
|shop_id|  month|year|m_amount|y_amount| ratio|
+-------+-------+----+--------+--------+------+
|      1|2022-01|2022|      30|     100|   0.3|
|      1|2022-02|2022|      10|     100|   0.1|
|      1|2022-03|2022|      30|     100|   0.3|
|      1|2022-05|2022|      10|     100|   0.1|
|      1|2022-06|2022|      20|     100|   0.2|
|      1|2023-01|2023|      30|     100|   0.3|
|      1|2023-02|2023|      10|     100|   0.1|
|      1|2023-03|2023|      30|     100|   0.3|
|      1|2023-05|2023|      10|     100|   0.1|
|      1|2023-06|2023|      20|     100|   0.2|
|     11|2022-01|2022|      30|     120|  0.25|
|     11|2022-02|2022|      10|     120|0.0833|
|     11|2022-03|2022|      30|     120|  0.25|
|     11|2022-05|2022|      20|     120|0.1667|
|     11|2022-06|2022|      30|     120|  0.25|
|     11|2023-01|2023|      40|     110|0.3636|
|     11|2023-02|2023|      10|     110|0.0909|
|     11|2023-03|2023|      20|     110|0.1818|
|     11|2023-05|2023|      10|     110|0.0909|
|     11|2023-06|2023|      30|     110|0.2727|
+-------+-------+----+--------+--------+------+
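The partition-and-sum idea behind the window can be sketched outside Spark with plain Scala collections. The sample below is restricted to shop 1's 2022 monthly totals (values taken from the m_amount column above), so the partition is a single shop-year; this is a minimal illustration, not the actual Spark execution:

```scala
// Monthly totals for shop "1" in 2022, from the aggregated output above.
val monthly = List("2022-01" -> 30, "2022-02" -> 10, "2022-03" -> 30,
                   "2022-05" -> 10, "2022-06" -> 20)

// The window's sum over the (shop, year) partition: shop 1's yearly total.
val yAmount = monthly.map(_._2).sum

// Each month's share of the yearly total, rounded to 4 decimals like round(...,4).
val ratios = monthly.map { case (month, m) =>
  month -> (math.rint(m.toDouble / yAmount * 10000) / 10000)
}
```

Here yAmount is 100, so for example 2022-01 contributes 30/100 = 0.3 of shop 1's 2022 sales.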
3. Compute the month-over-month (MoM) change
MoM = (this month − last month) / last month
For clarity, this is done in several steps:
3.1 Aggregate the current month's data (the order by is only to make the output easier to read and would normally be omitted)
spark.sql(
s"""
|select
|shop_id,
|date_format(event_day,'yyyy-MM') event_day,
|sum(amount) amount
|from t1 group by shop_id,date_format(event_day,'yyyy-MM') order by date_format(event_day,'yyyy-MM')
|""".stripMargin).createOrReplaceTempView("t2")
Output:
+-------+---------+------+
|shop_id|event_day|amount|
+-------+---------+------+
| 1| 2022-01| 30|
| 1| 2022-02| 10|
| 1| 2022-03| 30|
| 1| 2022-05| 10|
| 1| 2022-06| 20|
| 1| 2023-01| 30|
| 1| 2023-02| 10|
| 1| 2023-03| 30|
| 1| 2023-05| 10|
| 1| 2023-06| 20|
| 11| 2022-01| 30|
| 11| 2022-02| 10|
| 11| 2022-03| 30|
| 11| 2022-05| 20|
| 11| 2022-06| 30|
| 11| 2023-01| 40|
| 11| 2023-02| 10|
| 11| 2023-03| 20|
| 11| 2023-05| 10|
| 11| 2023-06| 30|
+-------+---------+------+
3.2 Aggregate last month's data
Computing MoM requires the previous month's sales: for example, 2022-02 needs 2022-01's sales, i.e. (2022-02 sales − 2022-01 sales) / 2022-01 sales. How do we get 2022-01's sales next to 2022-02's row? The approach here is a join, which requires the previous month's key to match the current month's: shift 2022-01 forward so it becomes 2022-02, then join.
spark.sql(
s"""
|select
|shop_id,
|date_format(add_months(event_day,1),'yyyy-MM') event_day,
|sum(amount) amount
|from t1 group by shop_id,date_format(add_months(event_day,1),'yyyy-MM')
|order by shop_id,date_format(add_months(event_day,1),'yyyy-MM')
|""".stripMargin).createOrReplaceTempView("t3")
Output:
+-------+---------+------+
|shop_id|event_day|amount|
+-------+---------+------+
| 1| 2022-02| 30|
| 1| 2022-03| 10|
| 1| 2022-04| 30|
| 1| 2022-06| 10|
| 1| 2022-07| 20|
| 1| 2023-02| 30|
| 1| 2023-03| 10|
| 1| 2023-04| 30|
| 1| 2023-06| 10|
| 1| 2023-07| 20|
| 11| 2022-02| 30|
| 11| 2022-03| 10|
| 11| 2022-04| 30|
| 11| 2022-06| 20|
| 11| 2022-07| 30|
| 11| 2023-02| 40|
| 11| 2023-03| 10|
| 11| 2023-04| 20|
| 11| 2023-06| 10|
| 11| 2023-07| 30|
+-------+---------+------+
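The add_months(event_day,1) trick simply relabels every month with the following month so the join keys line up. The same shift can be sketched in plain Scala with java.time, independent of Spark (shiftMonth is an illustrative helper, not part of the Spark job):

```scala
import java.time.YearMonth
import java.time.format.DateTimeFormatter

// Note: java.time uses 'uuuu' for the plain year field; Spark's pattern is 'yyyy'.
val fmt = DateTimeFormatter.ofPattern("uuuu-MM")

// Shift a 'yyyy-MM' key forward by n months, like add_months + date_format.
def shiftMonth(month: String, n: Int): String =
  YearMonth.parse(month, fmt).plusMonths(n).format(fmt)
```

shiftMonth("2022-01", 1) gives "2022-02", and year boundaries are handled too: shiftMonth("2022-12", 1) gives "2023-01".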
3.3 Join the current month with the previous month
Where there is no previous month the result is null, meaning the MoM is undefined; null can also be mapped to some other value, depending on the requirements.
spark.sql(
s"""
|select
|shop_id,
|a_event_day,
|round((a_amount-b_amount)/b_amount,4) huanbi
|from
|(
|select
|t2.shop_id,
|t2.event_day a_event_day,
|t3.event_day b_event_day,
|t2.amount a_amount,
|t3.amount b_amount
|from t2 left join t3
|on t2.shop_id=t3.shop_id and t2.event_day=t3.event_day
|) order by shop_id,a_event_day
|""".stripMargin).show()
Output:
+-------+-----------+-------+
|shop_id|a_event_day| huanbi|
+-------+-----------+-------+
| 1| 2022-01| null|
| 1| 2022-02|-0.6667|
| 1| 2022-03| 2.0|
| 1| 2022-05| null|
| 1| 2022-06| 1.0|
| 1| 2023-01| null|
| 1| 2023-02|-0.6667|
| 1| 2023-03| 2.0|
| 1| 2023-05| null|
| 1| 2023-06| 1.0|
| 11| 2022-01| null|
| 11| 2022-02|-0.6667|
| 11| 2022-03| 2.0|
| 11| 2022-05| null|
| 11| 2022-06| 0.5|
| 11| 2023-01| null|
| 11| 2023-02| -0.75|
| 11| 2023-03| 1.0|
| 11| 2023-05| null|
| 11| 2023-06| 2.0|
+-------+-----------+-------+
Some readers will think of using a window function with lead or lag to fetch the previous or next row, but that approach is flawed: lag returns the previous row, not the previous calendar month, so whenever the months are not consecutive the result is wrong.
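The pitfall can be seen in a few lines of plain Scala, using shop 1's 2022 monthly totals from section 3.1 (note there is no 2022-04 row):

```scala
// Shop "1" monthly totals in month order; 2022-04 is absent.
val rows = Vector("2022-01" -> 30, "2022-02" -> 10, "2022-03" -> 30,
                  "2022-05" -> 10, "2022-06" -> 20)

// lag()-style: compare with the previous ROW, ignoring the calendar gap.
val i = rows.indexWhere(_._1 == "2022-05")
val lagBased = (rows(i)._2 - rows(i - 1)._2).toDouble / rows(i - 1)._2
// pairs 2022-05 with 2022-03: (10 - 30) / 30, a bogus "month-over-month"

// join-style: look up the actual previous calendar month, 2022-04.
val joinBased = rows.toMap.get("2022-04").map(p => (rows(i)._2 - p).toDouble / p)
// None: there is no 2022-04 row, so the MoM for 2022-05 is correctly null
```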
4. Month year-over-year (YoY)
Compare this year's January with last year's January. The logic is the same as MoM, except the shift is 12 months instead of 1.
spark.sql(
s"""
|select
|shop_id,
|date_format(add_months(event_day,12),'yyyy-MM') event_day,
|sum(amount) amount
|from t1 group by shop_id,date_format(add_months(event_day,12),'yyyy-MM')
|order by shop_id,date_format(add_months(event_day,12),'yyyy-MM')
|""".stripMargin).createOrReplaceTempView("t3")
spark.sql(
s"""
|select
|shop_id,
|a_event_day,
|round((a_amount-b_amount)/b_amount,4) tongbi
|from
|(
|select
|t2.shop_id,
|t2.event_day a_event_day,
|t3.event_day b_event_day,
|t2.amount a_amount,
|t3.amount b_amount
|from t2 left join t3
|on t2.shop_id=t3.shop_id and t2.event_day=t3.event_day
|) order by shop_id,a_event_day
|""".stripMargin).show()
Output:
+-------+-----------+-------+
|shop_id|a_event_day| tongbi|
+-------+-----------+-------+
| 1| 2022-01| null|
| 1| 2022-02| null|
| 1| 2022-03| null|
| 1| 2022-05| null|
| 1| 2022-06| null|
| 1| 2023-01| 0.0|
| 1| 2023-02| 0.0|
| 1| 2023-03| 0.0|
| 1| 2023-05| 0.0|
| 1| 2023-06| 0.0|
| 11| 2022-01| null|
| 11| 2022-02| null|
| 11| 2022-03| null|
| 11| 2022-05| null|
| 11| 2022-06| null|
| 11| 2023-01| 0.3333|
| 11| 2023-02| 0.0|
| 11| 2023-03|-0.3333|
| 11| 2023-05| -0.5|
| 11| 2023-06| 0.0|
+-------+-----------+-------+
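The whole YoY lookup can be mimicked in plain Scala by shifting the comparison key back 12 months; the sketch below uses shop 11's monthly totals from section 3.1 (yoy is an illustrative helper, and Option.None plays the role of the left join's null):

```scala
// Shop "11" monthly totals keyed by 'yyyy-MM'.
val m = Map(
  "2022-01" -> 30, "2022-02" -> 10, "2022-03" -> 30, "2022-05" -> 20, "2022-06" -> 30,
  "2023-01" -> 40, "2023-02" -> 10, "2023-03" -> 20, "2023-05" -> 10, "2023-06" -> 30)

// YoY: compare a month with the same month one year earlier; None when last
// year's month is missing, like the null from the left join.
def yoy(month: String): Option[Double] = for {
  cur  <- m.get(month)
  prev <- m.get(s"${month.take(4).toInt - 1}${month.drop(4)}")  // back 12 months
} yield math.rint((cur - prev).toDouble / prev * 10000) / 10000
```

yoy("2023-01") gives Some(0.3333) and yoy("2022-01") gives None, matching the tongbi column above.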