Update statement with JOIN and WHERE clause
Working with some update-and-join statements from previous code, and trying to use Spark SQL statements to get the same result. Table 1:
insert into tab1
VALUES(1, 'A1', 'B1', 1),
(2, 'A2', 'B2', 0),
(3, 'A3', 'B3', 1),
(4, 'A4', 'B4', 1),
(5, 'A5', 'B5', 0),
(6, 'A6', 'B6', 1)
;
Table 2:
insert into tab2
VALUES(1, 'A1', 'B1', 0),
(2, 'A2', 'B2', 1),
(3, 'A3', 'B3', 1),
(6, 'A6', 'B6', 0)
;
Update statement:
update tab1
set v1 = concat(t1.v1,t2.v1)
from tab1 t1
inner join tab2 t2 on t1.id =t2.id
where t2.v3 > 0
Result table 1:
1 A2A2 B1 1
2 A2A2 B2 0
3 A2A2 B3 1
4 A2A2 B4 1
5 A2A2 B5 0
6 A2A2 B6 1
Any idea why it's not:
1 A1 B1 1
2 A2A2 B2 0
3 A3A3 B3 1
4 A4 B4 1
5 A5 B5 0
6 A6 B6 1
Get rid of the tab1 in the FROM clause and put tab2 there instead. You can effectively do the join in the WHERE clause:
UPDATE tab1 t1
SET v1 = concat(t1.v1,t2.v1)
FROM tab2 t2
WHERE t1.id =t2.id AND t2.v3 > 0;
Demo: db<>fiddle
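The corrected statement can also be sanity-checked locally. The sketch below reproduces the expected result in SQLite via Python's stdlib `sqlite3`, replacing the `UPDATE ... FROM` with a portable correlated subquery (older SQLite versions lack `UPDATE ... FROM`). The column name `v2` is an assumption; only `id`, `v1`, and `v3` appear in the original query.

```python
import sqlite3

# Schema inferred from the INSERT statements above; the column name
# v2 is assumed, since only id/v1/v3 appear in the original query.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE tab1 (id INTEGER, v1 TEXT, v2 TEXT, v3 INTEGER);
    CREATE TABLE tab2 (id INTEGER, v1 TEXT, v2 TEXT, v3 INTEGER);
    INSERT INTO tab1 VALUES
        (1, 'A1', 'B1', 1), (2, 'A2', 'B2', 0), (3, 'A3', 'B3', 1),
        (4, 'A4', 'B4', 1), (5, 'A5', 'B5', 0), (6, 'A6', 'B6', 1);
    INSERT INTO tab2 VALUES
        (1, 'A1', 'B1', 0), (2, 'A2', 'B2', 1),
        (3, 'A3', 'B3', 1), (6, 'A6', 'B6', 0);
""")

# Portable equivalent of the corrected UPDATE ... FROM: the join
# condition moves into a correlated subquery, and WHERE EXISTS
# restricts the update to rows that actually have a match.
conn.execute("""
    UPDATE tab1
    SET v1 = v1 || (SELECT t2.v1 FROM tab2 t2
                    WHERE t2.id = tab1.id AND t2.v3 > 0)
    WHERE EXISTS (SELECT 1 FROM tab2 t2
                  WHERE t2.id = tab1.id AND t2.v3 > 0);
""")

rows = conn.execute("SELECT id, v1 FROM tab1 ORDER BY id").fetchall()
print(rows)
# [(1, 'A1'), (2, 'A2A2'), (3, 'A3A3'), (4, 'A4'), (5, 'A5'), (6, 'A6')]
```

Only rows 2 and 3 have a match in tab2 with v3 > 0, so only their v1 values are doubled, matching the expected result above.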
Turns out the previous code was running on MSSQL. With the same syntax it gives the expected results after trying it on an MSSQL server:
1 A1 B1 1
2 A2A2 B2 0
3 A3A3 B3 1
4 A4 B4 1
5 A5 B5 0
6 A6 B6 1
In PostgreSQL, as mentioned in a comment on this question, the from_item must not repeat the table being updated! Listing tab1 again in the FROM clause joins the target table against a second, unrelated copy of itself, so every target row pairs with an arbitrary matching (t1, t2) row, which is why every v1 came out as A2A2.