Window function returns inconsistent results across two executions #1277
whitecloud6688 asked this question in Q&A · Unanswered
Why does the same window-function statement return different results when executed twice? Thanks.
Flink SQL> SELECT window_start
Flink SQL> SELECT window_start
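Only the first line of each of the two statements above is visible. For reference, a minimal sketch of a comparable tumbling-window aggregation over the bid table is shown below; the 10-minute window size and the SUM(price) aggregate are assumptions, not necessarily the statement that was actually run.
-- Sketch only: window size and aggregate are assumed, not taken from the original post
SELECT window_start,
       window_end,
       SUM(price) AS total_price
FROM TABLE(
    TUMBLE(TABLE bid, DESCRIPTOR(bidtime), INTERVAL '10' MINUTES))
GROUP BY window_start, window_end;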
Flink SQL> select * from bid;
+----+----------------------+-------------------------+--------------------------------+--------------------------------+
| op | uuid | bidtime | price | item |
+----+----------------------+-------------------------+--------------------------------+--------------------------------+
| +I | 1 | 2020-04-15 08:05:00.000 | 4.0 | C |
| +I | 2 | 2020-04-15 08:07:00.000 | 2.0 | A |
| +I | 3 | 2020-04-15 08:09:00.000 | 5.0 | D |
| +I | 4 | 2020-04-15 08:11:00.000 | 3.0 | B |
| +I | 5 | 2020-04-15 08:13:00.000 | 1.0 | E |
| +I | 6 | 2020-04-15 08:17:00.000 | 6.0 | F |
| +I | 7 | 2020-04-15 10:11:00.000 | 7.0 | B |
| +I | 8 | 2020-04-16 08:13:00.000 | 1.0 | E |
| +I | 9 | 2020-05-15 08:17:00.000 | 6.0 | F |
| +I | 10 | 2019-05-15 08:17:00.000 | 6.0 | F |
+----+----------------------+-------------------------+--------------------------------+--------------------------------+
Received a total of 10 rows
CREATE TABLE statement:
create table if not exists bid (
uuid bigint
, bidtime timestamp(3)
, price float
, item string
, watermark for bidtime as bidtime - interval '1' second
) with (
'connector' = 'hudi'
, 'table.type' = 'COPY_ON_WRITE'
, 'path' = '/hive/warehouse/test.db/bid'
, 'read.streaming.enable' = 'false'
, 'read.streaming.check-interval' = '10'
);
insert into bid values
(1 ,to_timestamp('2020-04-15 08:05', 'yyyy-MM-dd HH:mm'), 4.00 ,'C')
,(2 ,to_timestamp('2020-04-15 08:07', 'yyyy-MM-dd HH:mm'), 2.00 ,'A')
,(3 ,to_timestamp('2020-04-15 08:09', 'yyyy-MM-dd HH:mm'), 5.00 ,'D')
,(4 ,to_timestamp('2020-04-15 08:11', 'yyyy-MM-dd HH:mm'), 3.00 ,'B')
,(5 ,to_timestamp('2020-04-15 08:13', 'yyyy-MM-dd HH:mm'), 1.00 ,'E')
,(6 ,to_timestamp('2020-04-15 08:17', 'yyyy-MM-dd HH:mm'), 6.00 ,'F')
,(7 ,to_timestamp('2020-04-15 10:11', 'yyyy-MM-dd HH:mm'), 7.00 ,'B')
,(8 ,to_timestamp('2020-04-16 08:13', 'yyyy-MM-dd HH:mm'), 1.00 ,'E')
,(9 ,to_timestamp('2020-05-15 08:17', 'yyyy-MM-dd HH:mm'), 6.00 ,'F')
,(10 ,to_timestamp('2019-05-15 08:17', 'yyyy-MM-dd HH:mm'), 6.00 ,'F');
Versions:
flink-1.14.4
hudi-0.11.0
spark-3.2.1-bin-hadoop2.7
hadoop-2.10.1