MySQL: Eliminating duplicate rows without breaking a foreign key constraint
Posted: 2013-12-12 12:34:43

I have a customer database full of standardized addresses, and there are duplicates.
Every user created their own record and entered their own address, so we have a one-to-one relationship between users and addresses:
CREATE TABLE `users` (
`UserID` INT UNSIGNED NOT NULL AUTO_INCREMENT,
`Name` VARCHAR(63),
`Email` VARCHAR(63),
`AddressID` INT UNSIGNED,
PRIMARY KEY (`UserID`) USING BTREE
) ENGINE=InnoDB DEFAULT CHARSET=utf8;
CREATE TABLE `addresses` (
`AddressID` INT UNSIGNED NOT NULL AUTO_INCREMENT,
`Duplicate` VARCHAR(1),
`Address1` VARCHAR(63) DEFAULT NULL,
`Address2` VARCHAR(63) DEFAULT NULL,
`City` VARCHAR(63) DEFAULT NULL,
`State` VARCHAR(2) DEFAULT NULL,
`ZIP` VARCHAR(10) DEFAULT NULL,
PRIMARY KEY (`AddressID`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8;
And the data:
INSERT INTO `users` VALUES
(1, 'Michael', 'michael@email.com', 1),
(2, 'Steve', 'steve@email.com', 2),
(3, 'Judy', 'judy@email.com', 3),
(4, 'Kathy', 'kathy@email.com', 4),
(5, 'Mark', 'mark@email.com', 5),
(6, 'Robert', 'robert@email.com', 6),
(7, 'Susan', 'susan@email.com', 7),
(8, 'Paul', 'paul@email.com', 8),
(9, 'Patrick', 'patrick@email.com', 9),
(10, 'Mary', 'mary@email.com', 10),
(11, 'James', 'james@email.com', 11),
(12, 'Barbara', 'barbara@email.com', 12),
(13, 'Peter', 'peter@email.com', 13);
INSERT INTO `addresses` VALUES
(1, '', '1234 Main Street', '', 'Springfield', 'KS', '54321'),
(2, 'Y', '1234 Main Street', '', 'Springfield', 'KS', '54321'),
(3, 'Y', '1234 Main Street', '', 'Springfield', 'KS', '54321'),
(4, '', '5678 Sycamore Lane', '', 'Upstate', 'NY', '50000'),
(5, '', '1000 State Street', 'Apt C', 'Sunnydale', 'OH', '54321'),
(6, 'Y', '1234 Main Street', '', 'Springfield', 'KS', '54321'),
(7, 'Y', '1000 State Street', 'Apt C', 'Sunnydale', 'OH', '54321'),
(8, 'Y', '1234 Main Street', '', 'Springfield', 'KS', '54321'),
(9, '', '1000 State Street', 'Apt A', 'Sunnydale', 'OH', '54321'),
(10, 'Y', '1234 Main Street', '', 'Springfield', 'KS', '54321'),
(11, 'Y', '5678 Sycamore Lane', '', 'Upstate', 'NY', '50000'),
(12, 'Y', '1000 Main Street', 'Apt A', 'Sunnydale', 'OH', '54321'),
(13, '', '9999 Valleyview', '', 'Springfield', 'KS', '54321');
Oh yes, and let me add the foreign key relationship:
ALTER TABLE `users` ADD CONSTRAINT `AddressID`
FOREIGN KEY `AddressID` (`AddressID`)
REFERENCES `addresses` (`AddressID`);
Our address list was cleaned by a third-party service that normalizes the data and flags where we have duplicates. That is where the Duplicate column comes from: if it contains a 'Y', the row is a copy of another address. As the sample data shows, the primary address is not flagged as a duplicate.
I obviously want to delete all the duplicate records, but there are user records pointing at them, and I need those users to point at the non-duplicate version of each address instead.
So how do I update AddressID in users to match the non-duplicate addresses?
The only way I can think of is to loop over all the data in a higher-level language, but I'm fairly sure MySQL has all the tools needed to do this kind of operation in a better way.
Here is what I have tried:
SELECT COUNT(*) as cnt, GROUP_CONCAT(AddressID ORDER BY AddressID) AS ids
FROM addresses
GROUP BY Address1, Address2, City, State, ZIP
HAVING cnt > 1;
+-----+--------------+
| cnt | ids |
+-----+--------------+
| 2 | 5,7 |
| 6 | 1,2,3,6,8,10 |
| 2 | 4,11 |
+-----+--------------+
3 rows in set (0.00 sec)
From there, I could loop over each result row and execute something like:
UPDATE `users` SET `AddressID` = 1 WHERE `AddressID` IN (2,3,6,8,10);
But there has to be a better, MySQL-specific way, doesn't there?
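For what it's worth, the loop-in-a-higher-level-language fallback is short. Below is a minimal sketch in Python using the stdlib sqlite3 driver, so the logic can be checked without a MySQL server; the schema is trimmed to the columns the dedup needs, and keeping the lowest AddressID in each group is an assumption that happens to match the sample data (the unflagged row always has the smallest ID):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE addresses (AddressID INTEGER PRIMARY KEY, Duplicate TEXT,
    Address1 TEXT, Address2 TEXT, City TEXT, State TEXT, ZIP TEXT);
CREATE TABLE users (UserID INTEGER PRIMARY KEY, AddressID INTEGER);
INSERT INTO addresses VALUES
 (1,'','1234 Main Street','','Springfield','KS','54321'),
 (2,'Y','1234 Main Street','','Springfield','KS','54321'),
 (4,'','5678 Sycamore Lane','','Upstate','NY','50000'),
 (11,'Y','5678 Sycamore Lane','','Upstate','NY','50000');
INSERT INTO users VALUES (1,1),(2,2),(4,4),(11,11);
""")

# One query to find the duplicate groups, then one UPDATE per group.
groups = conn.execute("""
    SELECT GROUP_CONCAT(AddressID) FROM addresses
    GROUP BY Address1, Address2, City, State, ZIP
    HAVING COUNT(*) > 1
""").fetchall()

for (ids,) in groups:
    id_list = sorted(int(i) for i in ids.split(","))
    keep, dupes = id_list[0], id_list[1:]      # lowest ID wins
    placeholders = ",".join("?" * len(dupes))
    conn.execute(
        f"UPDATE users SET AddressID = ? WHERE AddressID IN ({placeholders})",
        [keep, *dupes],
    )

print(conn.execute("SELECT UserID, AddressID FROM users ORDER BY UserID").fetchall())
# → [(1, 1), (2, 1), (4, 4), (11, 4)]
```

It works, but it is N+1 round trips where a single set-based UPDATE would do, which is exactly the question.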
When all is said and done, the data should look like this:
SELECT * FROM `users`;
+--------+---------+-------------------+-----------+
| UserID | Name | Email | AddressID |
+--------+---------+-------------------+-----------+
| 1 | Michael | michael@email.com | 1 |
| 2 | Steve | steve@email.com | 1 |
| 3 | Judy | judy@email.com | 1 |
| 4 | Kathy | kathy@email.com | 4 |
| 5 | Mark | mark@email.com | 5 |
| 6 | Robert | robert@email.com | 1 |
| 7 | Susan | susan@email.com | 5 |
| 8 | Paul | paul@email.com | 1 |
| 9 | Patrick | patrick@email.com | 9 |
| 10 | Mary | mary@email.com | 1 |
| 11 | James | james@email.com | 4 |
| 12 | Barbara | barbara@email.com | 1 |
| 13 | Peter | peter@email.com | 13 |
+--------+---------+-------------------+-----------+
13 rows in set (0.00 sec)
SELECT * FROM `addresses`;
+-----------+-----------+--------------------+----------+-------------+-------+-------+
| AddressID | Duplicate | Address1 | Address2 | City | State | ZIP |
+-----------+-----------+--------------------+----------+-------------+-------+-------+
| 1 | | 1234 Main Street | | Springfield | KS | 54321 |
| 4 | | 5678 Sycamore Lane | | Upstate | NY | 50000 |
| 5 | | 1000 State Street | Apt C | Sunnydale | OH | 54321 |
| 9 | | 1000 State Street | Apt A | Sunnydale | OH | 54321 |
| 13 | | 9999 Valleyview | | Springfield | KS | 54321 |
+-----------+-----------+--------------------+----------+-------------+-------+-------+
5 rows in set (0.00 sec)
Help?
Answer 1:
You have a many-to-one relationship between users and addresses (that is, multiple users can map to the same address). That seems a little odd to me, but I suppose it can be useful. A many-to-many relationship would make more sense: a user can have multiple addresses, and the same address can be shared by multiple users. Usually a single user has several addresses. Updating the schema might help, but I digress.
UPDATE users
-- We only care about users mapped to duplicate addresses
JOIN addresses dupe ON (users.AddressID = dupe.AddressID AND dupe.Duplicate='Y')
-- If your normalizer thingy worked right, these will all map to non-duplicates
JOIN addresses nondupe ON (dupe.Address1 = nondupe.Address1
-- Compare to other columns if you want
AND nondupe.Duplicate = '')
-- Set to the nondupe ID
SET users.AddressID = nondupe.AddressID;
http://sqlfiddle.com/#!2/5d303/1
Answer 2:
Select the results you want to see:
SELECT a.UserID
,a.Name
,a.Email
,(
SELECT addressID
FROM addresses c
WHERE c.Address1 = b.Address1
AND c.Address2 = b.Address2
AND c.City = b.City
AND c.State = b.State
AND c.ZIP = b.ZIP
AND DUPLICATE != 'Y'
) as AddressID
FROM users a
JOIN addresses b
ON a.AddressID = b.AddressID
This updates the users table to the results shown in the query above:
UPDATE users a
JOIN addresses b
ON a.AddressID = b.AddressID
SET a.addressID =
(
SELECT addressID
FROM addresses c
WHERE c.Address1 = b.Address1
AND c.Address2 = b.Address2
AND c.City = b.City
AND c.State = b.State
AND c.ZIP = b.ZIP
AND Duplicate != 'Y'
)
WHERE Duplicate = 'Y'
Note that with the sample data you provided, the ID for #12 Barbara comes out NULL in the SELECT query, because her address is flagged as a duplicate even though it is actually unique within the list provided. It does not match address 1 as shown in your "how it should look" results.
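That NULL is worth seeing in isolation, because the UPDATE form would write it straight into her foreign key. A sketch with Python's stdlib sqlite3, assuming only Barbara's row exists (the subquery that looks for an unflagged twin finds nothing):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE addresses (AddressID INTEGER PRIMARY KEY, Duplicate TEXT,
    Address1 TEXT, Address2 TEXT, City TEXT, State TEXT, ZIP TEXT);
-- Barbara's address: flagged 'Y' although no unflagged twin exists.
INSERT INTO addresses VALUES
 (12,'Y','1000 Main Street','Apt A','Sunnydale','OH','54321');
""")

# The scalar subquery from the UPDATE, evaluated for her row alone:
row = conn.execute("""
SELECT (SELECT c.AddressID FROM addresses c
        WHERE c.Address1 = b.Address1 AND c.Address2 = b.Address2
          AND c.City = b.City AND c.State = b.State AND c.ZIP = b.ZIP
          AND c.Duplicate != 'Y')
FROM addresses b WHERE b.AddressID = 12
""").fetchone()
print(row)  # (None,) -- so the UPDATE would null out her AddressID
```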
Edit:
To handle incorrect duplicate flags like #12 Barbara's, or duplicates that went unflagged, you can skip the duplicate-flag check entirely and just put ORDER BY and LIMIT on the subquery, so it returns the first (lowest) matching AddressID regardless of the duplicate flag:
UPDATE users a
JOIN addresses b
ON a.AddressID = b.AddressID
SET a.addressID =
(
SELECT addressID
FROM addresses c
WHERE c.Address1 = b.Address1
AND c.Address2 = b.Address2
AND c.City = b.City
AND c.State = b.State
AND c.ZIP = b.ZIP
ORDER BY c.addressID ASC
LIMIT 1
)
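A sketch of this ORDER BY/LIMIT variant with Python's stdlib sqlite3 (the correlated reference is moved into the WHERE clause, which is equivalent; data trimmed to three rows, including a mislabeled unique row like Barbara's):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE addresses (AddressID INTEGER PRIMARY KEY, Duplicate TEXT,
    Address1 TEXT, Address2 TEXT, City TEXT, State TEXT, ZIP TEXT);
CREATE TABLE users (UserID INTEGER PRIMARY KEY, AddressID INTEGER);
INSERT INTO addresses VALUES
 (1,'','1000 State Street','Apt C','Sunnydale','OH','54321'),
 (7,'Y','1000 State Street','Apt C','Sunnydale','OH','54321'),
 (12,'Y','1000 Main Street','Apt A','Sunnydale','OH','54321');
INSERT INTO users VALUES (1,1),(7,7),(12,12);
""")

# Every user is remapped to the lowest AddressID whose fields match
# their current address -- the duplicate flag is ignored entirely.
conn.execute("""
UPDATE users SET AddressID = (
    SELECT c.AddressID
    FROM addresses c, addresses b
    WHERE b.AddressID = users.AddressID
      AND c.Address1 = b.Address1 AND c.Address2 = b.Address2
      AND c.City = b.City AND c.State = b.State AND c.ZIP = b.ZIP
    ORDER BY c.AddressID ASC
    LIMIT 1
)
""")

print(conn.execute("SELECT UserID, AddressID FROM users ORDER BY UserID").fetchall())
# → [(1, 1), (7, 1), (12, 12)]
```

User 12's only match is her own row, so the bogus flag does no harm: she keeps AddressID 12 instead of being set to NULL.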