So recently I have been working on migrating a Django app from MySQL to PostgreSQL. MySQL was set up to be a bit more lenient about data integrity, and I realized that a unique_together constraint in a model's Meta was being violated. I had to find all the rows where the data wasn't actually unique together so I could then figure out how to resolve the conflicts.
```python
from collections import defaultdict

fields_to_obj = defaultdict(list)

# Iterate through all the objects, getting or creating a key in
# `fields_to_obj` based on a tuple of the fields you want to be
# unique together.
for obj in CustomFieldValue.objects.all():
    fields_to_obj[(obj.field_id, obj.object_id)].append(obj)

# Now `fields_to_obj` maps each unique field pair to the list of
# objects that share that pair.

# To find the duplicates, keep only the values with more than one
# object in them.
non_unique_objs = [objs for objs in fields_to_obj.values() if len(objs) > 1]
```
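The grouping step itself can be exercised without a database. Here is a minimal sketch of the same logic as a standalone function, where `pairs` is a hypothetical stand-in for the `(field_id, object_id)` tuples that would come out of the queryset:

```python
from collections import defaultdict

def find_non_unique(pairs):
    """Group pairs and return the ones that appear more than once,
    mapped to the positions where they occurred."""
    groups = defaultdict(list)
    for idx, pair in enumerate(pairs):
        groups[pair].append(idx)
    # Only keep groups with duplicates.
    return {pair: idxs for pair, idxs in groups.items() if len(idxs) > 1}

# The pair (1, 10) appears at positions 0 and 2, so it is reported.
pairs = [(1, 10), (2, 20), (1, 10), (3, 30)]
print(find_non_unique(pairs))  # {(1, 10): [0, 2]}
```

For a large table this kind of in-Python pass loads every row; the same duplicates could also be found database-side with a GROUP BY / HAVING COUNT(*) > 1 query, which avoids pulling the whole table into memory.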