I have a model A which contains a generic foreign key whose limit_choices_to restricts it to three other models (call them B, C and D) in the same app. I know the limitation of generic foreign keys: you can't use filter, get, or any other queryset operation across them.

So to achieve something like A.objects.filter(generic_object__name="foo"), I have to filter B's, C's and D's objects first as querysets, iterate over them, and use the generic reverse relation to collect the A objects as a list (not a queryset). I'm not sure how this indirect querying will affect SQL performance on the database.

PS: I need to use the generic foreign keys, so please suggest a SQL-level improvement rather than a redesign of the models.

Using Django 1.4.3 and Postgres.
I'd like to quote some words from David Cramer, developer of Disqus and Django committer:

"Generic relations are fine. They are not slow, just more difficult to manage in your code base."

I've seen many people tell others not to use generic relations because they're slow, but never explain how they're slow.
"Avoid Django's GenericForeignKey" has a good and thorough description of the database design antipatterns involved in generic foreign keys (or "polymorphic associations," as they are called in Rails-speak).
As for performance, it takes three database queries every time you want to retrieve the related GenericForeignKey resource from your model:

1. one query to fetch your model instance (its content type id and object id columns);
2. one query against django_content_type to resolve the content type id into an app_label and model, from which the target table name (app_label + "_" + model) is derived;
3. one query against that table (TABLE_NAME) to fetch the related row by object id.

When people say that generic foreign keys have a performance penalty, they are referring to this query overhead.
There are only a very narrow set of circumstances under which you really want to use generic foreign keys. The above-linked article discusses those, as well.
Add an index_together Meta option to your model:

    class Meta:
        index_together = [('cprofile_id', 'cprofile_type')]