Offline Data Warehouse (11) -------- Building the ODS Layer

柔情只为你懂 2024-04-01 11:09

Contents

  • Preface
  • 一、ODS Layer (User Behavior Data)
      1. Create the log table ods_log
      2. Single vs. double quotes in the shell
      3. Data loading script for the ODS log table
  • 二、ODS Layer (Business Data)
      1. Create the business tables
      2. First-day data loading script for the ODS business tables
      3. Daily data loading script for the ODS business tables

Preface

The ODS layer keeps the data exactly as it arrives, with no modification, so it also serves as a backup of the raw data.

Data is stored with LZO compression to reduce disk usage; 100 GB of raw data typically compresses to under 10 GB.

All tables are partitioned to prevent full-table scans later on; partitioned tables are used heavily in production.

All tables are external. In production, apart from temporary tables for one's own use, the vast majority of tables are created as external tables, so that dropping a table does not delete the underlying HDFS data.


一、ODS Layer (User Behavior Data)

1. Create the log table ods_log

A. Create a partitioned table that supports LZO compression

DDL statement:

  hive (gmall)> drop table if exists ods_log;
  CREATE EXTERNAL TABLE ods_log (`line` string)
  PARTITIONED BY (`dt` string) -- partitioned by date
  STORED AS -- storage format: reads use LzoTextInputFormat
      INPUTFORMAT 'com.hadoop.mapred.DeprecatedLzoTextInputFormat'
      OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat'
  LOCATION '/warehouse/gmall/ods/ods_log' -- HDFS location of the data
  ;

For details on LZO compression in Hive, see:

https://cwiki.apache.org/confluence/display/Hive/LanguageManual+LZO

Partition plan: (figure omitted)

B. Load data

Data loading: (figure omitted)

  hive (gmall)>
  load data inpath '/origin_data/gmall/log/topic_log/2020-06-14' into table ods_log partition(dt='2020-06-14');

Note: all dates are formatted as yyyy-MM-dd, the date format Hive supports by default.

Create an index for the LZO-compressed files (this makes them splittable):

  [fancy@node101 bin]$ hadoop jar /opt/module/hadoop-3.1.3/share/hadoop/common/hadoop-lzo-0.4.20.jar com.hadoop.compression.lzo.DistributedLzoIndexer /warehouse/gmall/ods/ods_log/dt=2020-06-14

2. Single vs. double quotes in the shell

A. Create a test.sh file in /home/fancy/bin

  [fancy@node101 bin]$ vim test.sh

Add the following content to the file:

  #!/bin/bash
  do_date=$1
  echo '$do_date'
  echo "$do_date"
  echo "'$do_date'"
  echo '"$do_date"'
  echo `date`

B. Check the output

  [fancy@node101 bin]$ test.sh 2020-06-14
  $do_date
  2020-06-14
  '2020-06-14'
  "$do_date"
  2020 06 18 Thu 21:02:08 CST

C. Summary

  • Single quotes do not expand variables
  • Double quotes expand variables
  • Backquotes (`) execute the enclosed command
  • Single quotes nested inside double quotes: the variable is expanded
  • Double quotes nested inside single quotes: the variable is not expanded

3. Data loading script for the ODS log table

A. Write the script

Create the script in /home/fancy/bin on node101:

  [fancy@node101 bin]$ vim hdfs_to_ods_log.sh

Write the following content into the script:

  #!/bin/bash
  # Define the database name in one place so it is easy to change
  APP=gmall
  # If a date argument is supplied, use it; otherwise default to yesterday
  if [ -n "$1" ] ;then
      do_date=$1
  else
      do_date=`date -d "-1 day" +%F`
  fi
  echo ================== log date: $do_date ==================
  sql="
  load data inpath '/origin_data/$APP/log/topic_log/$do_date' into table ${APP}.ods_log partition(dt='$do_date');
  "
  hive -e "$sql"
  hadoop jar /opt/module/hadoop-3.1.3/share/hadoop/common/hadoop-lzo-0.4.20.jar com.hadoop.compression.lzo.DistributedLzoIndexer /warehouse/$APP/ods/ods_log/dt=$do_date

Note 1:

  • [ -n "$var" ] tests whether the variable's value is non-empty
  • – non-empty: returns true
  • – empty: returns false

Note: the variable inside [ -n ] must be wrapped in double quotes (" "); otherwise an empty variable expands to nothing and the test misbehaves.
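The pitfall above can be sketched directly (a minimal example, runnable in bash; the variable name is made up for illustration):

```shell
# With an empty variable, the unquoted test [ -n $var ] collapses to [ -n ],
# which the shell treats as "is the string '-n' non-empty?" and wrongly
# returns true. The quoted form [ -n "$var" ] behaves as intended.
var=""

if [ -n $var ]; then            # expands to [ -n ] -> always true
    unquoted="non-empty"
else
    unquoted="empty"
fi

if [ -n "$var" ]; then          # expands to [ -n "" ] -> correctly false
    quoted="non-empty"
else
    quoted="empty"
fi

echo "unquoted: $unquoted, quoted: $quoted"
```

This is exactly why the loader scripts below write `[ -n "$1" ]` rather than `[ -n $1 ]`.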

Note 2:

For details on the date command, run date --help.
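A quick sketch of the date arithmetic the script relies on (this assumes GNU coreutils date, as on the cluster nodes; `-d` is not portable to BSD date):

```shell
# -d "-1 day"  evaluates a relative date instead of "now"
# +%F          prints it as yyyy-MM-dd (equivalent to +%Y-%m-%d)
yesterday=$(date -d "-1 day" +%F)
echo "$yesterday"
```

This is the expression the script uses to default do_date to yesterday when no argument is given.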

Grant the script execute permission:

  [fancy@node101 bin]$ chmod 777 hdfs_to_ods_log.sh

B. Using the script

Run the script:

  [fancy@node101 module]$ hdfs_to_ods_log.sh 2020-06-14

Then verify that the data was imported.

二、ODS Layer (Business Data)

The partition plan for the ODS business tables: (figure omitted)

The data loading approach for the ODS business tables: (figure omitted)

1. Create the business tables

1. Activity info table

  DROP TABLE IF EXISTS ods_activity_info;
  CREATE EXTERNAL TABLE ods_activity_info(
      `id` STRING COMMENT 'ID',
      `activity_name` STRING COMMENT 'Activity name',
      `activity_type` STRING COMMENT 'Activity type',
      `start_time` STRING COMMENT 'Start time',
      `end_time` STRING COMMENT 'End time',
      `create_time` STRING COMMENT 'Creation time'
  ) COMMENT 'Activity info table'
  PARTITIONED BY (`dt` STRING)
  ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
  STORED AS
      INPUTFORMAT 'com.hadoop.mapred.DeprecatedLzoTextInputFormat'
      OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat'
  LOCATION '/warehouse/gmall/ods/ods_activity_info/';

2. Activity rule table

  DROP TABLE IF EXISTS ods_activity_rule;
  CREATE EXTERNAL TABLE ods_activity_rule(
      `id` STRING COMMENT 'ID',
      `activity_id` STRING COMMENT 'Activity ID',
      `activity_type` STRING COMMENT 'Activity type',
      `condition_amount` DECIMAL(16,2) COMMENT 'Spend threshold amount',
      `condition_num` BIGINT COMMENT 'Item-count threshold',
      `benefit_amount` DECIMAL(16,2) COMMENT 'Discount amount',
      `benefit_discount` DECIMAL(16,2) COMMENT 'Discount rate',
      `benefit_level` STRING COMMENT 'Benefit level'
  ) COMMENT 'Activity rule table'
  PARTITIONED BY (`dt` STRING)
  ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
  STORED AS
      INPUTFORMAT 'com.hadoop.mapred.DeprecatedLzoTextInputFormat'
      OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat'
  LOCATION '/warehouse/gmall/ods/ods_activity_rule/';

3. Level-1 category table

  DROP TABLE IF EXISTS ods_base_category1;
  CREATE EXTERNAL TABLE ods_base_category1(
      `id` STRING COMMENT 'ID',
      `name` STRING COMMENT 'Name'
  ) COMMENT 'Level-1 product category table'
  PARTITIONED BY (`dt` STRING)
  ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
  STORED AS
      INPUTFORMAT 'com.hadoop.mapred.DeprecatedLzoTextInputFormat'
      OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat'
  LOCATION '/warehouse/gmall/ods/ods_base_category1/';

4. Level-2 category table

  DROP TABLE IF EXISTS ods_base_category2;
  CREATE EXTERNAL TABLE ods_base_category2(
      `id` STRING COMMENT 'ID',
      `name` STRING COMMENT 'Name',
      `category1_id` STRING COMMENT 'Level-1 category ID'
  ) COMMENT 'Level-2 product category table'
  PARTITIONED BY (`dt` STRING)
  ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
  STORED AS
      INPUTFORMAT 'com.hadoop.mapred.DeprecatedLzoTextInputFormat'
      OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat'
  LOCATION '/warehouse/gmall/ods/ods_base_category2/';

5. Level-3 category table

  DROP TABLE IF EXISTS ods_base_category3;
  CREATE EXTERNAL TABLE ods_base_category3(
      `id` STRING COMMENT 'ID',
      `name` STRING COMMENT 'Name',
      `category2_id` STRING COMMENT 'Level-2 category ID'
  ) COMMENT 'Level-3 product category table'
  PARTITIONED BY (`dt` STRING)
  ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
  STORED AS
      INPUTFORMAT 'com.hadoop.mapred.DeprecatedLzoTextInputFormat'
      OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat'
  LOCATION '/warehouse/gmall/ods/ods_base_category3/';

6. Code dictionary table

  DROP TABLE IF EXISTS ods_base_dic;
  CREATE EXTERNAL TABLE ods_base_dic(
      `dic_code` STRING COMMENT 'Code',
      `dic_name` STRING COMMENT 'Code name',
      `parent_code` STRING COMMENT 'Parent code',
      `create_time` STRING COMMENT 'Creation date',
      `operate_time` STRING COMMENT 'Operation date'
  ) COMMENT 'Code dictionary table'
  PARTITIONED BY (`dt` STRING)
  ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
  STORED AS
      INPUTFORMAT 'com.hadoop.mapred.DeprecatedLzoTextInputFormat'
      OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat'
  LOCATION '/warehouse/gmall/ods/ods_base_dic/';

7. Province table

  DROP TABLE IF EXISTS ods_base_province;
  CREATE EXTERNAL TABLE ods_base_province (
      `id` STRING COMMENT 'ID',
      `name` STRING COMMENT 'Province name',
      `region_id` STRING COMMENT 'Region ID',
      `area_code` STRING COMMENT 'Area code',
      `iso_code` STRING COMMENT 'ISO-3166 code, for visualization',
      `iso_3166_2` STRING COMMENT 'ISO 3166-2 code, for visualization'
  ) COMMENT 'Province table'
  ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
  STORED AS
      INPUTFORMAT 'com.hadoop.mapred.DeprecatedLzoTextInputFormat'
      OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat'
  LOCATION '/warehouse/gmall/ods/ods_base_province/';

8. Region table

  DROP TABLE IF EXISTS ods_base_region;
  CREATE EXTERNAL TABLE ods_base_region (
      `id` STRING COMMENT 'ID',
      `region_name` STRING COMMENT 'Region name'
  ) COMMENT 'Region table'
  ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
  STORED AS
      INPUTFORMAT 'com.hadoop.mapred.DeprecatedLzoTextInputFormat'
      OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat'
  LOCATION '/warehouse/gmall/ods/ods_base_region/';

9. Brand table

  DROP TABLE IF EXISTS ods_base_trademark;
  CREATE EXTERNAL TABLE ods_base_trademark (
      `id` STRING COMMENT 'ID',
      `tm_name` STRING COMMENT 'Brand name'
  ) COMMENT 'Brand table'
  PARTITIONED BY (`dt` STRING)
  ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
  STORED AS
      INPUTFORMAT 'com.hadoop.mapred.DeprecatedLzoTextInputFormat'
      OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat'
  LOCATION '/warehouse/gmall/ods/ods_base_trademark/';

10. Cart table

  DROP TABLE IF EXISTS ods_cart_info;
  CREATE EXTERNAL TABLE ods_cart_info(
      `id` STRING COMMENT 'ID',
      `user_id` STRING COMMENT 'User ID',
      `sku_id` STRING COMMENT 'SKU ID',
      `cart_price` DECIMAL(16,2) COMMENT 'Price when added to cart',
      `sku_num` BIGINT COMMENT 'Quantity',
      `sku_name` STRING COMMENT 'SKU name (denormalized)',
      `create_time` STRING COMMENT 'Creation time',
      `operate_time` STRING COMMENT 'Modification time',
      `is_ordered` STRING COMMENT 'Whether already ordered',
      `order_time` STRING COMMENT 'Order time',
      `source_type` STRING COMMENT 'Source type',
      `source_id` STRING COMMENT 'Source ID'
  ) COMMENT 'Cart table'
  PARTITIONED BY (`dt` STRING)
  ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
  STORED AS
      INPUTFORMAT 'com.hadoop.mapred.DeprecatedLzoTextInputFormat'
      OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat'
  LOCATION '/warehouse/gmall/ods/ods_cart_info/';

11. Review table

  DROP TABLE IF EXISTS ods_comment_info;
  CREATE EXTERNAL TABLE ods_comment_info(
      `id` STRING COMMENT 'ID',
      `user_id` STRING COMMENT 'User ID',
      `sku_id` STRING COMMENT 'Product SKU',
      `spu_id` STRING COMMENT 'Product SPU',
      `order_id` STRING COMMENT 'Order ID',
      `appraise` STRING COMMENT 'Rating',
      `create_time` STRING COMMENT 'Review time'
  ) COMMENT 'Product review table'
  PARTITIONED BY (`dt` STRING)
  ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
  STORED AS
      INPUTFORMAT 'com.hadoop.mapred.DeprecatedLzoTextInputFormat'
      OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat'
  LOCATION '/warehouse/gmall/ods/ods_comment_info/';

12. Coupon info table

  DROP TABLE IF EXISTS ods_coupon_info;
  CREATE EXTERNAL TABLE ods_coupon_info(
      `id` STRING COMMENT 'Coupon ID',
      `coupon_name` STRING COMMENT 'Coupon name',
      `coupon_type` STRING COMMENT 'Coupon type: 1 cash, 2 discount, 3 spend-threshold reduction, 4 item-count-threshold discount',
      `condition_amount` DECIMAL(16,2) COMMENT 'Spend threshold amount',
      `condition_num` BIGINT COMMENT 'Item-count threshold',
      `activity_id` STRING COMMENT 'Activity ID',
      `benefit_amount` DECIMAL(16,2) COMMENT 'Reduction amount',
      `benefit_discount` DECIMAL(16,2) COMMENT 'Discount rate',
      `create_time` STRING COMMENT 'Creation time',
      `range_type` STRING COMMENT 'Scope type: 1 product, 2 category, 3 brand',
      `limit_num` BIGINT COMMENT 'Maximum number of claims',
      `taken_count` BIGINT COMMENT 'Number already claimed',
      `start_time` STRING COMMENT 'Claim start time',
      `end_time` STRING COMMENT 'Claim end time',
      `operate_time` STRING COMMENT 'Modification time',
      `expire_time` STRING COMMENT 'Expiration time'
  ) COMMENT 'Coupon table'
  PARTITIONED BY (`dt` STRING)
  ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
  STORED AS
      INPUTFORMAT 'com.hadoop.mapred.DeprecatedLzoTextInputFormat'
      OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat'
  LOCATION '/warehouse/gmall/ods/ods_coupon_info/';

13. Coupon usage table

  DROP TABLE IF EXISTS ods_coupon_use;
  CREATE EXTERNAL TABLE ods_coupon_use(
      `id` STRING COMMENT 'ID',
      `coupon_id` STRING COMMENT 'Coupon ID',
      `user_id` STRING COMMENT 'User ID',
      `order_id` STRING COMMENT 'Order ID',
      `coupon_status` STRING COMMENT 'Coupon status',
      `get_time` STRING COMMENT 'Claim time',
      `using_time` STRING COMMENT 'Use time (order placed)',
      `used_time` STRING COMMENT 'Use time (payment)',
      `expire_time` STRING COMMENT 'Expiration time'
  ) COMMENT 'Coupon usage table'
  PARTITIONED BY (`dt` STRING)
  ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
  STORED AS
      INPUTFORMAT 'com.hadoop.mapred.DeprecatedLzoTextInputFormat'
      OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat'
  LOCATION '/warehouse/gmall/ods/ods_coupon_use/';

14. Favorites table

  DROP TABLE IF EXISTS ods_favor_info;
  CREATE EXTERNAL TABLE ods_favor_info(
      `id` STRING COMMENT 'ID',
      `user_id` STRING COMMENT 'User ID',
      `sku_id` STRING COMMENT 'SKU ID',
      `spu_id` STRING COMMENT 'SPU ID',
      `is_cancel` STRING COMMENT 'Whether canceled',
      `create_time` STRING COMMENT 'Favorite time',
      `cancel_time` STRING COMMENT 'Cancel time'
  ) COMMENT 'Product favorites table'
  PARTITIONED BY (`dt` STRING)
  ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
  STORED AS
      INPUTFORMAT 'com.hadoop.mapred.DeprecatedLzoTextInputFormat'
      OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat'
  LOCATION '/warehouse/gmall/ods/ods_favor_info/';

15. Order detail table

  DROP TABLE IF EXISTS ods_order_detail;
  CREATE EXTERNAL TABLE ods_order_detail(
      `id` STRING COMMENT 'ID',
      `order_id` STRING COMMENT 'Order ID',
      `sku_id` STRING COMMENT 'Product ID',
      `sku_name` STRING COMMENT 'Product name',
      `order_price` DECIMAL(16,2) COMMENT 'Product price',
      `sku_num` BIGINT COMMENT 'Quantity',
      `create_time` STRING COMMENT 'Creation time',
      `source_type` STRING COMMENT 'Source type',
      `source_id` STRING COMMENT 'Source ID',
      `split_final_amount` DECIMAL(16,2) COMMENT 'Prorated final amount',
      `split_activity_amount` DECIMAL(16,2) COMMENT 'Prorated activity discount',
      `split_coupon_amount` DECIMAL(16,2) COMMENT 'Prorated coupon discount'
  ) COMMENT 'Order detail table'
  PARTITIONED BY (`dt` STRING)
  ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
  STORED AS
      INPUTFORMAT 'com.hadoop.mapred.DeprecatedLzoTextInputFormat'
      OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat'
  LOCATION '/warehouse/gmall/ods/ods_order_detail/';

16. Order detail-activity association table

  DROP TABLE IF EXISTS ods_order_detail_activity;
  CREATE EXTERNAL TABLE ods_order_detail_activity(
      `id` STRING COMMENT 'ID',
      `order_id` STRING COMMENT 'Order ID',
      `order_detail_id` STRING COMMENT 'Order detail ID',
      `activity_id` STRING COMMENT 'Activity ID',
      `activity_rule_id` STRING COMMENT 'Activity rule ID',
      `sku_id` BIGINT COMMENT 'Product ID',
      `create_time` STRING COMMENT 'Creation time'
  ) COMMENT 'Order detail-activity association table'
  PARTITIONED BY (`dt` STRING)
  ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
  STORED AS
      INPUTFORMAT 'com.hadoop.mapred.DeprecatedLzoTextInputFormat'
      OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat'
  LOCATION '/warehouse/gmall/ods/ods_order_detail_activity/';

17. Order detail-coupon association table

  DROP TABLE IF EXISTS ods_order_detail_coupon;
  CREATE EXTERNAL TABLE ods_order_detail_coupon(
      `id` STRING COMMENT 'ID',
      `order_id` STRING COMMENT 'Order ID',
      `order_detail_id` STRING COMMENT 'Order detail ID',
      `coupon_id` STRING COMMENT 'Coupon ID',
      `coupon_use_id` STRING COMMENT 'Coupon usage record ID',
      `sku_id` STRING COMMENT 'Product ID',
      `create_time` STRING COMMENT 'Creation time'
  ) COMMENT 'Order detail-coupon association table'
  PARTITIONED BY (`dt` STRING)
  ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
  STORED AS
      INPUTFORMAT 'com.hadoop.mapred.DeprecatedLzoTextInputFormat'
      OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat'
  LOCATION '/warehouse/gmall/ods/ods_order_detail_coupon/';

18. Order table

  DROP TABLE IF EXISTS ods_order_info;
  CREATE EXTERNAL TABLE ods_order_info (
      `id` STRING COMMENT 'Order ID',
      `final_amount` DECIMAL(16,2) COMMENT 'Final order amount',
      `order_status` STRING COMMENT 'Order status',
      `user_id` STRING COMMENT 'User ID',
      `payment_way` STRING COMMENT 'Payment method',
      `delivery_address` STRING COMMENT 'Delivery address',
      `out_trade_no` STRING COMMENT 'Payment transaction number',
      `create_time` STRING COMMENT 'Creation time',
      `operate_time` STRING COMMENT 'Operation time',
      `expire_time` STRING COMMENT 'Expiration time',
      `tracking_no` STRING COMMENT 'Shipment tracking number',
      `province_id` STRING COMMENT 'Province ID',
      `activity_reduce_amount` DECIMAL(16,2) COMMENT 'Activity reduction amount',
      `coupon_reduce_amount` DECIMAL(16,2) COMMENT 'Coupon reduction amount',
      `original_amount` DECIMAL(16,2) COMMENT 'Original order amount',
      `feight_fee` DECIMAL(16,2) COMMENT 'Shipping fee',
      `feight_fee_reduce` DECIMAL(16,2) COMMENT 'Shipping fee reduction'
  ) COMMENT 'Order table'
  PARTITIONED BY (`dt` STRING)
  ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
  STORED AS
      INPUTFORMAT 'com.hadoop.mapred.DeprecatedLzoTextInputFormat'
      OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat'
  LOCATION '/warehouse/gmall/ods/ods_order_info/';

19. Refund (returned order) table

  DROP TABLE IF EXISTS ods_order_refund_info;
  CREATE EXTERNAL TABLE ods_order_refund_info(
      `id` STRING COMMENT 'ID',
      `user_id` STRING COMMENT 'User ID',
      `order_id` STRING COMMENT 'Order ID',
      `sku_id` STRING COMMENT 'Product ID',
      `refund_type` STRING COMMENT 'Refund type',
      `refund_num` BIGINT COMMENT 'Refund quantity',
      `refund_amount` DECIMAL(16,2) COMMENT 'Refund amount',
      `refund_reason_type` STRING COMMENT 'Refund reason type',
      `refund_status` STRING COMMENT 'Refund status', -- a refund normally passes through states such as buyer applied, seller approved, seller received goods, refund completed; those states are not modeled here, so this table is loaded incrementally
      `create_time` STRING COMMENT 'Refund time'
  ) COMMENT 'Refund table'
  PARTITIONED BY (`dt` STRING)
  ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
  STORED AS
      INPUTFORMAT 'com.hadoop.mapred.DeprecatedLzoTextInputFormat'
      OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat'
  LOCATION '/warehouse/gmall/ods/ods_order_refund_info/';

20. Order status log table

  DROP TABLE IF EXISTS ods_order_status_log;
  CREATE EXTERNAL TABLE ods_order_status_log (
      `id` STRING COMMENT 'ID',
      `order_id` STRING COMMENT 'Order ID',
      `order_status` STRING COMMENT 'Order status',
      `operate_time` STRING COMMENT 'Modification time'
  ) COMMENT 'Order status log table'
  PARTITIONED BY (`dt` STRING)
  ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
  STORED AS
      INPUTFORMAT 'com.hadoop.mapred.DeprecatedLzoTextInputFormat'
      OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat'
  LOCATION '/warehouse/gmall/ods/ods_order_status_log/';

21. Payment table

  DROP TABLE IF EXISTS ods_payment_info;
  CREATE EXTERNAL TABLE ods_payment_info(
      `id` STRING COMMENT 'ID',
      `out_trade_no` STRING COMMENT 'External transaction number',
      `order_id` STRING COMMENT 'Order ID',
      `user_id` STRING COMMENT 'User ID',
      `payment_type` STRING COMMENT 'Payment type',
      `trade_no` STRING COMMENT 'Transaction number',
      `payment_amount` DECIMAL(16,2) COMMENT 'Payment amount',
      `subject` STRING COMMENT 'Transaction description',
      `payment_status` STRING COMMENT 'Payment status',
      `create_time` STRING COMMENT 'Creation time',
      `callback_time` STRING COMMENT 'Callback time'
  ) COMMENT 'Payment record table'
  PARTITIONED BY (`dt` STRING)
  ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
  STORED AS
      INPUTFORMAT 'com.hadoop.mapred.DeprecatedLzoTextInputFormat'
      OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat'
  LOCATION '/warehouse/gmall/ods/ods_payment_info/';

22. Refund payment table

  DROP TABLE IF EXISTS ods_refund_payment;
  CREATE EXTERNAL TABLE ods_refund_payment(
      `id` STRING COMMENT 'ID',
      `out_trade_no` STRING COMMENT 'External transaction number',
      `order_id` STRING COMMENT 'Order ID',
      `sku_id` STRING COMMENT 'SKU ID',
      `payment_type` STRING COMMENT 'Payment type',
      `trade_no` STRING COMMENT 'Transaction number',
      `refund_amount` DECIMAL(16,2) COMMENT 'Refund amount',
      `subject` STRING COMMENT 'Transaction description',
      `refund_status` STRING COMMENT 'Refund status',
      `create_time` STRING COMMENT 'Creation time',
      `callback_time` STRING COMMENT 'Callback time'
  ) COMMENT 'Refund payment table'
  PARTITIONED BY (`dt` STRING)
  ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
  STORED AS
      INPUTFORMAT 'com.hadoop.mapred.DeprecatedLzoTextInputFormat'
      OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat'
  LOCATION '/warehouse/gmall/ods/ods_refund_payment/';

23. Product platform attribute table

  DROP TABLE IF EXISTS ods_sku_attr_value;
  CREATE EXTERNAL TABLE ods_sku_attr_value(
      `id` STRING COMMENT 'ID',
      `attr_id` STRING COMMENT 'Platform attribute ID',
      `value_id` STRING COMMENT 'Platform attribute value ID',
      `sku_id` STRING COMMENT 'Product ID',
      `attr_name` STRING COMMENT 'Platform attribute name',
      `value_name` STRING COMMENT 'Platform attribute value name'
  ) COMMENT 'SKU platform attribute table'
  PARTITIONED BY (`dt` STRING)
  ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
  STORED AS
      INPUTFORMAT 'com.hadoop.mapred.DeprecatedLzoTextInputFormat'
      OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat'
  LOCATION '/warehouse/gmall/ods/ods_sku_attr_value/';

24. Product (SKU) table

  DROP TABLE IF EXISTS ods_sku_info;
  CREATE EXTERNAL TABLE ods_sku_info(
      `id` STRING COMMENT 'SKU ID',
      `spu_id` STRING COMMENT 'SPU ID',
      `price` DECIMAL(16,2) COMMENT 'Price',
      `sku_name` STRING COMMENT 'Product name',
      `sku_desc` STRING COMMENT 'Product description',
      `weight` DECIMAL(16,2) COMMENT 'Weight',
      `tm_id` STRING COMMENT 'Brand ID',
      `category3_id` STRING COMMENT 'Level-3 category ID',
      `is_sale` STRING COMMENT 'Whether on sale',
      `create_time` STRING COMMENT 'Creation time'
  ) COMMENT 'SKU product table'
  PARTITIONED BY (`dt` STRING)
  ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
  STORED AS
      INPUTFORMAT 'com.hadoop.mapred.DeprecatedLzoTextInputFormat'
      OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat'
  LOCATION '/warehouse/gmall/ods/ods_sku_info/';

25. Product sale attribute table

  DROP TABLE IF EXISTS ods_sku_sale_attr_value;
  CREATE EXTERNAL TABLE ods_sku_sale_attr_value(
      `id` STRING COMMENT 'ID',
      `sku_id` STRING COMMENT 'SKU ID',
      `spu_id` STRING COMMENT 'SPU ID',
      `sale_attr_value_id` STRING COMMENT 'Sale attribute value ID',
      `sale_attr_id` STRING COMMENT 'Sale attribute ID',
      `sale_attr_name` STRING COMMENT 'Sale attribute name',
      `sale_attr_value_name` STRING COMMENT 'Sale attribute value name'
  ) COMMENT 'SKU sale attribute table'
  PARTITIONED BY (`dt` STRING)
  ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
  STORED AS
      INPUTFORMAT 'com.hadoop.mapred.DeprecatedLzoTextInputFormat'
      OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat'
  LOCATION '/warehouse/gmall/ods/ods_sku_sale_attr_value/';

26. Product (SPU) table

  DROP TABLE IF EXISTS ods_spu_info;
  CREATE EXTERNAL TABLE ods_spu_info(
      `id` STRING COMMENT 'SPU ID',
      `spu_name` STRING COMMENT 'SPU name',
      `category3_id` STRING COMMENT 'Level-3 category ID',
      `tm_id` STRING COMMENT 'Brand ID'
  ) COMMENT 'SPU product table'
  PARTITIONED BY (`dt` STRING)
  ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
  STORED AS
      INPUTFORMAT 'com.hadoop.mapred.DeprecatedLzoTextInputFormat'
      OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat'
  LOCATION '/warehouse/gmall/ods/ods_spu_info/';

27. User table

  DROP TABLE IF EXISTS ods_user_info;
  CREATE EXTERNAL TABLE ods_user_info(
      `id` STRING COMMENT 'User ID',
      `login_name` STRING COMMENT 'Login name',
      `nick_name` STRING COMMENT 'Nickname',
      `name` STRING COMMENT 'Real name',
      `phone_num` STRING COMMENT 'Phone number',
      `email` STRING COMMENT 'Email',
      `user_level` STRING COMMENT 'User level',
      `birthday` STRING COMMENT 'Birthday',
      `gender` STRING COMMENT 'Gender',
      `create_time` STRING COMMENT 'Creation time',
      `operate_time` STRING COMMENT 'Operation time'
  ) COMMENT 'User table'
  PARTITIONED BY (`dt` STRING)
  ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
  STORED AS
      INPUTFORMAT 'com.hadoop.mapred.DeprecatedLzoTextInputFormat'
      OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat'
  LOCATION '/warehouse/gmall/ods/ods_user_info/';

2. First-day data loading script for the ODS business tables

A. Write the script

(1) Create the script hdfs_to_ods_db_init.sh in /home/fancy/bin:

  [fancy@node101 bin]$ vim hdfs_to_ods_db_init.sh

Write the following content into the script:

  #!/bin/bash
  APP=gmall
  # The first day requires an explicit date; refuse to run without one
  if [ -n "$2" ] ;then
      do_date=$2
  else
      echo "Please pass a date argument"
      exit
  fi
  ods_order_info="
  load data inpath '/origin_data/$APP/db/order_info/$do_date' OVERWRITE into table ${APP}.ods_order_info partition(dt='$do_date');"
  ods_order_detail="
  load data inpath '/origin_data/$APP/db/order_detail/$do_date' OVERWRITE into table ${APP}.ods_order_detail partition(dt='$do_date');"
  ods_sku_info="
  load data inpath '/origin_data/$APP/db/sku_info/$do_date' OVERWRITE into table ${APP}.ods_sku_info partition(dt='$do_date');"
  ods_user_info="
  load data inpath '/origin_data/$APP/db/user_info/$do_date' OVERWRITE into table ${APP}.ods_user_info partition(dt='$do_date');"
  ods_payment_info="
  load data inpath '/origin_data/$APP/db/payment_info/$do_date' OVERWRITE into table ${APP}.ods_payment_info partition(dt='$do_date');"
  ods_base_category1="
  load data inpath '/origin_data/$APP/db/base_category1/$do_date' OVERWRITE into table ${APP}.ods_base_category1 partition(dt='$do_date');"
  ods_base_category2="
  load data inpath '/origin_data/$APP/db/base_category2/$do_date' OVERWRITE into table ${APP}.ods_base_category2 partition(dt='$do_date');"
  ods_base_category3="
  load data inpath '/origin_data/$APP/db/base_category3/$do_date' OVERWRITE into table ${APP}.ods_base_category3 partition(dt='$do_date');"
  ods_base_trademark="
  load data inpath '/origin_data/$APP/db/base_trademark/$do_date' OVERWRITE into table ${APP}.ods_base_trademark partition(dt='$do_date');"
  ods_activity_info="
  load data inpath '/origin_data/$APP/db/activity_info/$do_date' OVERWRITE into table ${APP}.ods_activity_info partition(dt='$do_date');"
  ods_cart_info="
  load data inpath '/origin_data/$APP/db/cart_info/$do_date' OVERWRITE into table ${APP}.ods_cart_info partition(dt='$do_date');"
  ods_comment_info="
  load data inpath '/origin_data/$APP/db/comment_info/$do_date' OVERWRITE into table ${APP}.ods_comment_info partition(dt='$do_date');"
  ods_coupon_info="
  load data inpath '/origin_data/$APP/db/coupon_info/$do_date' OVERWRITE into table ${APP}.ods_coupon_info partition(dt='$do_date');"
  ods_coupon_use="
  load data inpath '/origin_data/$APP/db/coupon_use/$do_date' OVERWRITE into table ${APP}.ods_coupon_use partition(dt='$do_date');"
  ods_favor_info="
  load data inpath '/origin_data/$APP/db/favor_info/$do_date' OVERWRITE into table ${APP}.ods_favor_info partition(dt='$do_date');"
  ods_order_refund_info="
  load data inpath '/origin_data/$APP/db/order_refund_info/$do_date' OVERWRITE into table ${APP}.ods_order_refund_info partition(dt='$do_date');"
  ods_order_status_log="
  load data inpath '/origin_data/$APP/db/order_status_log/$do_date' OVERWRITE into table ${APP}.ods_order_status_log partition(dt='$do_date');"
  ods_spu_info="
  load data inpath '/origin_data/$APP/db/spu_info/$do_date' OVERWRITE into table ${APP}.ods_spu_info partition(dt='$do_date');"
  ods_activity_rule="
  load data inpath '/origin_data/$APP/db/activity_rule/$do_date' OVERWRITE into table ${APP}.ods_activity_rule partition(dt='$do_date');"
  ods_base_dic="
  load data inpath '/origin_data/$APP/db/base_dic/$do_date' OVERWRITE into table ${APP}.ods_base_dic partition(dt='$do_date');"
  ods_order_detail_activity="
  load data inpath '/origin_data/$APP/db/order_detail_activity/$do_date' OVERWRITE into table ${APP}.ods_order_detail_activity partition(dt='$do_date');"
  ods_order_detail_coupon="
  load data inpath '/origin_data/$APP/db/order_detail_coupon/$do_date' OVERWRITE into table ${APP}.ods_order_detail_coupon partition(dt='$do_date');"
  ods_refund_payment="
  load data inpath '/origin_data/$APP/db/refund_payment/$do_date' OVERWRITE into table ${APP}.ods_refund_payment partition(dt='$do_date');"
  ods_sku_attr_value="
  load data inpath '/origin_data/$APP/db/sku_attr_value/$do_date' OVERWRITE into table ${APP}.ods_sku_attr_value partition(dt='$do_date');"
  ods_sku_sale_attr_value="
  load data inpath '/origin_data/$APP/db/sku_sale_attr_value/$do_date' OVERWRITE into table ${APP}.ods_sku_sale_attr_value partition(dt='$do_date');"
  ods_base_province="
  load data inpath '/origin_data/$APP/db/base_province/$do_date' OVERWRITE into table ${APP}.ods_base_province;"
  ods_base_region="
  load data inpath '/origin_data/$APP/db/base_region/$do_date' OVERWRITE into table ${APP}.ods_base_region;"
  case $1 in
  "ods_order_info"){
      hive -e "$ods_order_info"
  };;
  "ods_order_detail"){
      hive -e "$ods_order_detail"
  };;
  "ods_sku_info"){
      hive -e "$ods_sku_info"
  };;
  "ods_user_info"){
      hive -e "$ods_user_info"
  };;
  "ods_payment_info"){
      hive -e "$ods_payment_info"
  };;
  "ods_base_category1"){
      hive -e "$ods_base_category1"
  };;
  "ods_base_category2"){
      hive -e "$ods_base_category2"
  };;
  "ods_base_category3"){
      hive -e "$ods_base_category3"
  };;
  "ods_base_trademark"){
      hive -e "$ods_base_trademark"
  };;
  "ods_activity_info"){
      hive -e "$ods_activity_info"
  };;
  "ods_cart_info"){
      hive -e "$ods_cart_info"
  };;
  "ods_comment_info"){
      hive -e "$ods_comment_info"
  };;
  "ods_coupon_info"){
      hive -e "$ods_coupon_info"
  };;
  "ods_coupon_use"){
      hive -e "$ods_coupon_use"
  };;
  "ods_favor_info"){
      hive -e "$ods_favor_info"
  };;
  "ods_order_refund_info"){
      hive -e "$ods_order_refund_info"
  };;
  "ods_order_status_log"){
      hive -e "$ods_order_status_log"
  };;
  "ods_spu_info"){
      hive -e "$ods_spu_info"
  };;
  "ods_activity_rule"){
      hive -e "$ods_activity_rule"
  };;
  "ods_base_dic"){
      hive -e "$ods_base_dic"
  };;
  "ods_order_detail_activity"){
      hive -e "$ods_order_detail_activity"
  };;
  "ods_order_detail_coupon"){
      hive -e "$ods_order_detail_coupon"
  };;
  "ods_refund_payment"){
      hive -e "$ods_refund_payment"
  };;
  "ods_sku_attr_value"){
      hive -e "$ods_sku_attr_value"
  };;
  "ods_sku_sale_attr_value"){
      hive -e "$ods_sku_sale_attr_value"
  };;
  "ods_base_province"){
      hive -e "$ods_base_province"
  };;
  "ods_base_region"){
      hive -e "$ods_base_region"
  };;
  "all"){
      hive -e "$ods_order_info$ods_order_detail$ods_sku_info$ods_user_info$ods_payment_info$ods_base_category1$ods_base_category2$ods_base_category3$ods_base_trademark$ods_activity_info$ods_cart_info$ods_comment_info$ods_coupon_info$ods_coupon_use$ods_favor_info$ods_order_refund_info$ods_order_status_log$ods_spu_info$ods_activity_rule$ods_base_dic$ods_order_detail_activity$ods_order_detail_coupon$ods_refund_payment$ods_sku_attr_value$ods_sku_sale_attr_value$ods_base_province$ods_base_region"
  };;
  esac
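The script dispatches on its first argument with case branches of the form `"name"){ ... };;` and an "all" branch that concatenates every per-table SQL string into one hive invocation. A toy sketch of the same dispatch pattern (echo stands in for the real `hive -e "$sql"` calls; the table names here are illustrative):

```shell
# Toy version of the loader's dispatch: $1 picks one table, "all" runs
# every load in a single call.
load_order_info="loading ods_order_info"
load_user_info="loading ods_user_info"

dispatch() {
    case $1 in
    "ods_order_info"){
        echo "$load_order_info"
    };;
    "ods_user_info"){
        echo "$load_user_info"
    };;
    "all"){
        echo "$load_order_info"; echo "$load_user_info"
    };;
    esac
}

dispatch all
```

Concatenating all the SQL strings into one `hive -e` call, as the "all" branch does, pays the Hive session startup cost once instead of 27 times.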

(2) Grant execute permission:

  [fancy@node101 bin]$ chmod +x hdfs_to_ods_db_init.sh

B. Using the script

(1) Run the script:

  [fancy@node101 bin]$ hdfs_to_ods_db_init.sh all 2020-06-14

(2) Verify that the data was imported.

3. Daily data loading script for the ODS business tables

A. Write the script

(1) Create the script hdfs_to_ods_db.sh in /home/fancy/bin:

  [fancy@node101 bin]$ vim hdfs_to_ods_db.sh

Write the following content into the script:

  #!/bin/bash
  APP=gmall
  # 如果输入了日期,则按输入日期装载;如果没输入日期,则取当前时间的前一天
  if [ -n "$2" ] ;then
      do_date=$2
  else
      do_date=`date -d "-1 day" +%F`
  fi
  ods_order_info="
  load data inpath '/origin_data/$APP/db/order_info/$do_date' OVERWRITE into table ${APP}.ods_order_info partition(dt='$do_date');"
  ods_order_detail="
  load data inpath '/origin_data/$APP/db/order_detail/$do_date' OVERWRITE into table ${APP}.ods_order_detail partition(dt='$do_date');"
  ods_sku_info="
  load data inpath '/origin_data/$APP/db/sku_info/$do_date' OVERWRITE into table ${APP}.ods_sku_info partition(dt='$do_date');"
  ods_user_info="
  load data inpath '/origin_data/$APP/db/user_info/$do_date' OVERWRITE into table ${APP}.ods_user_info partition(dt='$do_date');"
  ods_payment_info="
  load data inpath '/origin_data/$APP/db/payment_info/$do_date' OVERWRITE into table ${APP}.ods_payment_info partition(dt='$do_date');"
  ods_base_category1="
  load data inpath '/origin_data/$APP/db/base_category1/$do_date' OVERWRITE into table ${APP}.ods_base_category1 partition(dt='$do_date');"
  ods_base_category2="
  load data inpath '/origin_data/$APP/db/base_category2/$do_date' OVERWRITE into table ${APP}.ods_base_category2 partition(dt='$do_date');"
  ods_base_category3="
  load data inpath '/origin_data/$APP/db/base_category3/$do_date' OVERWRITE into table ${APP}.ods_base_category3 partition(dt='$do_date');"
  ods_base_trademark="
  load data inpath '/origin_data/$APP/db/base_trademark/$do_date' OVERWRITE into table ${APP}.ods_base_trademark partition(dt='$do_date');"
  ods_activity_info="
  load data inpath '/origin_data/$APP/db/activity_info/$do_date' OVERWRITE into table ${APP}.ods_activity_info partition(dt='$do_date');"
  ods_cart_info="
  load data inpath '/origin_data/$APP/db/cart_info/$do_date' OVERWRITE into table ${APP}.ods_cart_info partition(dt='$do_date');"
  ods_comment_info="
  load data inpath '/origin_data/$APP/db/comment_info/$do_date' OVERWRITE into table ${APP}.ods_comment_info partition(dt='$do_date');"
  ods_coupon_info="
  load data inpath '/origin_data/$APP/db/coupon_info/$do_date' OVERWRITE into table ${APP}.ods_coupon_info partition(dt='$do_date');"
  ods_coupon_use="
  load data inpath '/origin_data/$APP/db/coupon_use/$do_date' OVERWRITE into table ${APP}.ods_coupon_use partition(dt='$do_date');"
  ods_favor_info="
  load data inpath '/origin_data/$APP/db/favor_info/$do_date' OVERWRITE into table ${APP}.ods_favor_info partition(dt='$do_date');"
  ods_order_refund_info="
  load data inpath '/origin_data/$APP/db/order_refund_info/$do_date' OVERWRITE into table ${APP}.ods_order_refund_info partition(dt='$do_date');"
  ods_order_status_log="
  load data inpath '/origin_data/$APP/db/order_status_log/$do_date' OVERWRITE into table ${APP}.ods_order_status_log partition(dt='$do_date');"
  ods_spu_info="
  load data inpath '/origin_data/$APP/db/spu_info/$do_date' OVERWRITE into table ${APP}.ods_spu_info partition(dt='$do_date');"
  ods_activity_rule="
  load data inpath '/origin_data/$APP/db/activity_rule/$do_date' OVERWRITE into table ${APP}.ods_activity_rule partition(dt='$do_date');"
  ods_base_dic="
  load data inpath '/origin_data/$APP/db/base_dic/$do_date' OVERWRITE into table ${APP}.ods_base_dic partition(dt='$do_date');"
  ods_order_detail_activity="
  load data inpath '/origin_data/$APP/db/order_detail_activity/$do_date' OVERWRITE into table ${APP}.ods_order_detail_activity partition(dt='$do_date');"
  ods_order_detail_coupon="
  load data inpath '/origin_data/$APP/db/order_detail_coupon/$do_date' OVERWRITE into table ${APP}.ods_order_detail_coupon partition(dt='$do_date');"
  ods_refund_payment="
  load data inpath '/origin_data/$APP/db/refund_payment/$do_date' OVERWRITE into table ${APP}.ods_refund_payment partition(dt='$do_date');"
  ods_sku_attr_value="
  load data inpath '/origin_data/$APP/db/sku_attr_value/$do_date' OVERWRITE into table ${APP}.ods_sku_attr_value partition(dt='$do_date');"
  ods_sku_sale_attr_value="
  load data inpath '/origin_data/$APP/db/sku_sale_attr_value/$do_date' OVERWRITE into table ${APP}.ods_sku_sale_attr_value partition(dt='$do_date');"
  # base_province/base_region 为全量维表,仅首日装载,下方 case 分支中未包含这两张表
  ods_base_province="
  load data inpath '/origin_data/$APP/db/base_province/$do_date' OVERWRITE into table ${APP}.ods_base_province;"
  ods_base_region="
  load data inpath '/origin_data/$APP/db/base_region/$do_date' OVERWRITE into table ${APP}.ods_base_region;"
  case $1 in
  "ods_order_info"){
      hive -e "$ods_order_info"
  };;
  "ods_order_detail"){
      hive -e "$ods_order_detail"
  };;
  "ods_sku_info"){
      hive -e "$ods_sku_info"
  };;
  "ods_user_info"){
      hive -e "$ods_user_info"
  };;
  "ods_payment_info"){
      hive -e "$ods_payment_info"
  };;
  "ods_base_category1"){
      hive -e "$ods_base_category1"
  };;
  "ods_base_category2"){
      hive -e "$ods_base_category2"
  };;
  "ods_base_category3"){
      hive -e "$ods_base_category3"
  };;
  "ods_base_trademark"){
      hive -e "$ods_base_trademark"
  };;
  "ods_activity_info"){
      hive -e "$ods_activity_info"
  };;
  "ods_cart_info"){
      hive -e "$ods_cart_info"
  };;
  "ods_comment_info"){
      hive -e "$ods_comment_info"
  };;
  "ods_coupon_info"){
      hive -e "$ods_coupon_info"
  };;
  "ods_coupon_use"){
      hive -e "$ods_coupon_use"
  };;
  "ods_favor_info"){
      hive -e "$ods_favor_info"
  };;
  "ods_order_refund_info"){
      hive -e "$ods_order_refund_info"
  };;
  "ods_order_status_log"){
      hive -e "$ods_order_status_log"
  };;
  "ods_spu_info"){
      hive -e "$ods_spu_info"
  };;
  "ods_activity_rule"){
      hive -e "$ods_activity_rule"
  };;
  "ods_base_dic"){
      hive -e "$ods_base_dic"
  };;
  "ods_order_detail_activity"){
      hive -e "$ods_order_detail_activity"
  };;
  "ods_order_detail_coupon"){
      hive -e "$ods_order_detail_coupon"
  };;
  "ods_refund_payment"){
      hive -e "$ods_refund_payment"
  };;
  "ods_sku_attr_value"){
      hive -e "$ods_sku_attr_value"
  };;
  "ods_sku_sale_attr_value"){
      hive -e "$ods_sku_sale_attr_value"
  };;
  "all"){
      hive -e "$ods_order_info$ods_order_detail$ods_sku_info$ods_user_info$ods_payment_info$ods_base_category1$ods_base_category2$ods_base_category3$ods_base_trademark$ods_activity_info$ods_cart_info$ods_comment_info$ods_coupon_info$ods_coupon_use$ods_favor_info$ods_order_refund_info$ods_order_status_log$ods_spu_info$ods_activity_rule$ods_base_dic$ods_order_detail_activity$ods_order_detail_coupon$ods_refund_payment$ods_sku_attr_value$ods_sku_sale_attr_value"
  };;
  esac
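脚本开头"未传日期则默认取前一天"的取值逻辑,可以单独抽出来验证,下面是一个演示(函数名 get_do_date 为说明自拟,依赖 GNU date 的 -d 选项):

```shell
#!/bin/bash
# 演示 do_date 的取值逻辑:传了参数就用参数,否则取当前时间的前一天
get_do_date() {
  if [ -n "$1" ]; then
    echo "$1"
  else
    date -d "-1 day" +%F
  fi
}
get_do_date 2020-06-14   # 输出 2020-06-14
get_do_date              # 输出前一天,如 2024-03-31
```

%F 即 %Y-%m-%d,正好是 Hive 默认支持的 yyyy-MM-dd 分区日期格式。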

(2)增加执行权限

  [fancy@node101 bin]$ chmod +x hdfs_to_ods_db.sh

B、脚本使用

(1)执行脚本

  [fancy@node101 bin]$ hdfs_to_ods_db.sh all 2020-06-14

(2)查看数据是否导入成功
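每日装载后,可以检查各表当天的分区是否已生成。下面是一个示意脚本(表名仅列举部分,按需补全),拼出 show partitions 语句后交给 hive -e 执行并过滤目标日期:

```shell
#!/bin/bash
# 示例:拼出 show partitions 语句,检查目标日期分区是否已生成(表名仅列举部分)
APP=gmall
do_date=2020-06-14
check_sql=""
for tbl in ods_order_info ods_payment_info ods_cart_info; do
  check_sql="$check_sql show partitions ${APP}.$tbl;"
done
echo "$check_sql"
# 实际执行并过滤目标日期:hive -e "$check_sql" | grep "dt=$do_date"
```

grep 无输出的表即当天分区缺失,需检查上游采集和装载脚本的执行日志。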
