Dynamic FK using SQLAlchemy ORM

The data_id field of the Product_data table stores record ids from bio_data, geo_data or phy_data, depending on the value of the dataTable_name field of the Product table for a given prod_id.
So what I need is that, based on the table name stored in the dataTable_name field (bio_data, geo_data or phy_data), the data_id field in Product_data points to the records of the corresponding table.
How can I define these relationships using the SQLAlchemy ORM?
My approach was this:
t_Product_data = Table(
    'Product_data', metadata,
    Column('prod_id', ForeignKey(u'Product.prod_id'), primary_key=True, nullable=False),
    Column('data_id', ForeignKey(u'bio_data.bio_id'), primary_key=True, nullable=False),
    schema='public'
)
class Product(Base):
    __tablename__ = 'Product'
    __table_args__ = {u'schema': 'public'}

    dataTable_name = Column(String)
    prod_id = Column(Integer, primary_key=True)
    data = relationship(u'bio_data', secondary=t_Product_data)
This works fine; however, as you can see, the declaration is static and only works for the bio_data example.
What I need is to make this work for the other data sources as well (geo_data and phy_data), perhaps by using the dataTable_name field as a variable, like this:
class Product(Base):
    __tablename__ = 'Product'
    __table_args__ = {u'schema': 'public'}

    dataTable_name = Column(String)
    prod_id = Column(Integer, primary_key=True)
    data = relationship(dataTable_name, secondary=t_Product_data)
And what should I do with the data_id FK in the Product_data table? Is it possible to create a dynamic FK that points to different tables according to the value of the Product.dataTable_name field?
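One pattern that is often used here (a hedged sketch, not from the original question): a real foreign key cannot target a different table per row, so one relationship is declared per candidate table and a small Python property dispatches on dataTable_name. The names BioData, GeoData, PhyData and the per-target association tables t_Product_bio, t_Product_geo, t_Product_phy below are hypothetical stand-ins for the mapped classes and link tables you would actually have:

class Product(Base):
    __tablename__ = 'Product'
    __table_args__ = {u'schema': 'public'}

    prod_id = Column(Integer, primary_key=True)
    dataTable_name = Column(String)

    # One statically declared relationship per candidate data table;
    # each uses its own (hypothetical) association table.
    bio_data = relationship('BioData', secondary=t_Product_bio)
    geo_data = relationship('GeoData', secondary=t_Product_geo)
    phy_data = relationship('PhyData', secondary=t_Product_phy)

    @property
    def data(self):
        # dataTable_name holds 'bio_data', 'geo_data' or 'phy_data',
        # which matches the relationship attribute names above.
        return getattr(self, self.dataTable_name)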

Related

How to set Unique Constraint across multiple tables in django?

We have two models in a one-to-one relationship:
class A(models.Model):
    first_field_A = ...
    second_field_A = ...

class B(models.Model):
    first_field_B = ...
    a = models.OneToOneField(A, on_delete=models.CASCADE)
I need to define a Unique Constraint for first_field_A and second_field_A of model A and first_field_B of model B.
Is that even possible?
I have tried this:
class A(models.Model):
    ...

    class Meta:
        constraints = [
            UniqueConstraint(
                name='unique_constraint',
                fields=['first_field_A', 'second_field_A', 'b__first_field_B'],
            )
        ]
and I've got this error:
django.core.exceptions.FieldDoesNotExist: A has no field named 'b__first_field_B'
Why don't we have access to the fields of related tables?
What is the alternative?
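For context, UniqueConstraint compiles to a database UNIQUE constraint, and a table-level constraint can only reference columns of that table, which is why the b__first_field_B lookup is rejected. One commonly used alternative (a hedged sketch, not from the original post) is to enforce the rule in application code, for example in Model.clean(), which runs during full_clean()/form validation; note this does not protect against concurrent writes the way a database constraint would. The field types below are assumptions:

from django.core.exceptions import ValidationError
from django.db import models


class B(models.Model):
    first_field_B = models.CharField(max_length=50)  # assumed type
    a = models.OneToOneField('A', on_delete=models.CASCADE)

    def clean(self):
        if self.a_id is None:
            return
        # Reject another B whose own field and related A fields collide
        # with this one, i.e. the cross-table uniqueness rule.
        clash = B.objects.exclude(pk=self.pk).filter(
            first_field_B=self.first_field_B,
            a__first_field_A=self.a.first_field_A,
            a__second_field_A=self.a.second_field_A,
        )
        if clash.exists():
            raise ValidationError(
                'This combination of A and B field values already exists.'
            )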

SQLAlchemy how to reflect many-to-many relationships

I have two existing tables with a many-to-many relationship.
I'm trying to query from these tables, but it seems like I'm not able to reflect their many-to-many relationship.
Say I have two models with a reference table like this
from sqlalchemy import Column, Integer, String, ForeignKey
from sqlalchemy.orm import declarative_base, relationship
from sqlalchemy import Table
Base = declarative_base()
family_tree = Table(
    "family_tree",
    Base.metadata,
    Column("brother_id", Integer, ForeignKey("brother.id"), primary_key=True),
    Column("sister_id", Integer, ForeignKey("sister.id"), primary_key=True),
)

class Sister(Base):
    __tablename__ = 'sister'

    id = Column(Integer, autoincrement=True, primary_key=True)
    brothers = relationship("Brother", secondary=family_tree,
                            back_populates="sisters", uselist=True)

class Brother(Base):
    __tablename__ = 'brother'

    id = Column(Integer, autoincrement=True, primary_key=True)
    sisters = relationship("Sister", secondary=family_tree,
                           back_populates="brothers", uselist=True)
and I've populated all three tables. I now want to access the brothers info of a given Sister table entry.
I've tried to access the sister table like this
class SisterInspector(Base):
    __tablename__ = 'sister'
    __table_args__ = {'autoload': True}

session.query(SisterInspector).first()
which works fine with
session.query(SisterInspector.id)
but throws an AttributeError when trying
session.query(SisterInspector.brothers)
>>> AttributeError: 'Sister' object has no attribute 'brothers'
I've tried to access my tables in this alternative way:
from sqlalchemy.schema import Table, MetaData
meta = MetaData()
meta.reflect(bind=engine)
foo = Table('sister', meta, autoload_with=engine)
print(foo.columns)
>>>> ImmutableColumnCollection(sister.id)
I understand that the relationship is not defined as a column, but
foo
>>>> Table('sister',...)
also makes no mention of said relationship. How do I get the many-to-many relationship out of my database?
I'm not sure what the SisterInspector class is doing for you; my SQLAlchemy implementation doesn't use a special inspector class like that -- and I wonder if the fact that it's using the same table name as your Sister class is what's causing the issue.
Can you give us the output of something like
first_sister = session.query(Sister).first()
print(first_sister.brothers)
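If the goal is to have the relationships generated from the existing database rather than re-declared by hand, a hedged alternative (assuming SQLAlchemy 1.4+; the connection URL below is a placeholder) is the automap extension, which reflects the schema and, for a pure association table like family_tree, builds the many-to-many collections automatically:

from sqlalchemy import create_engine
from sqlalchemy.ext.automap import automap_base
from sqlalchemy.orm import Session

engine = create_engine("postgresql://user:pass@localhost/mydb")  # placeholder URL

AutoBase = automap_base()
AutoBase.prepare(autoload_with=engine)  # reflect tables and generate classes

ReflectedSister = AutoBase.classes.sister
ReflectedBrother = AutoBase.classes.brother

with Session(engine) as session:
    first_sister = session.query(ReflectedSister).first()
    # automap names many-to-many collections '<classname>_collection'
    print(first_sister.brother_collection)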

sqlalchemy group by unique values in column

I have a simple model, which looks like this
class Data(Base):
    __tablename__ = 'data'

    id = Column(Integer, primary_key=True)
    sub_id = Column(String)
    value = Column(JSONB)
sub_id is not a unique value. Is it possible to query data from this table, grouping the rows by sub_id together with their value column... something like
((sub_id, (all_rows_with_same_sub_id)), (sub_id, (all_rows_with_same_sub_id)))
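Two common ways to get that shape (a hedged sketch, not from the original post): aggregate on the database side with PostgreSQL's jsonb_agg, or order by sub_id and group the ORM objects in Python with itertools.groupby:

from itertools import groupby
from operator import attrgetter

from sqlalchemy import func

# Database-side: one row per sub_id, values collected into a JSONB array.
rows = (
    session.query(Data.sub_id, func.jsonb_agg(Data.value))
    .group_by(Data.sub_id)
    .all()
)

# Python-side alternative: order by sub_id, then group the full objects.
grouped = [
    (sub_id, list(items))
    for sub_id, items in groupby(
        session.query(Data).order_by(Data.sub_id),
        key=attrgetter("sub_id"),
    )
]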

How can I query data from multiple tables and sort by time?

I want to show a user a feed which would combine two different types of posts, each with a different schema. It seems like a union won't work because they have different columns. I'm using Postgres and SQLAlchemy.
I was thinking of fetching the entries from table1 and then the entries from table2, merging them, and then sorting by date, but that would be very inefficient.
EDIT:
I want to be able to query for a list of posts and images and have them sorted by created_at.
Posts table and Images table:
Post
    title
    body
    created_at
    ...many more

Image
    img_url
    caption
    created_at
    ...many more
One option:
SELECT created_at
FROM (
    SELECT created_at
    FROM Post
    UNION ALL
    SELECT created_at
    FROM Image
) AS feed  -- Postgres requires an alias on a derived table
ORDER BY created_at
Note that this could also be done without a subquery, but I find it clearer this way and it makes it more convenient to add filters.
Technically, the query above in its current form is the same as:
SELECT created_at
FROM Post
UNION ALL
SELECT created_at
FROM Image
ORDER BY created_at
SQLAlchemy example:
q1 = sess.query(SomeClass).filter(SomeClass.foo=='bar')
q2 = sess.query(SomeClass).filter(SomeClass.bar=='foo')
q3 = q1.union_all(q2).order_by('created_at')
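Applied to the Post and Image models from the question rather than a generic SomeClass, the same union could look roughly like this (a hedged sketch; it assumes Post and Image are mapped classes with id and created_at columns):

from sqlalchemy import literal

posts = session.query(Post.id, Post.created_at, literal('post').label('kind'))
images = session.query(Image.id, Image.created_at, literal('image').label('kind'))

# Each row carries (id, created_at, kind), enough to fetch or render the item.
feed = posts.union_all(images).order_by('created_at').all()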
Concrete example with a common ancestor (using an inferred version of your models):
class FeedItem(Base):
    __tablename__ = 'feed_item'  # table names added so the example maps cleanly

    id = Column(Integer, primary_key=True)

class Post(FeedItem):
    __tablename__ = 'post'

    id = Column(Integer, ForeignKey('feed_item.id'), primary_key=True)
    title = Column(String(100))
    body = Column(Text())
    created_at = Column(DateTime())

class Image(FeedItem):
    __tablename__ = 'image'

    id = Column(Integer, ForeignKey('feed_item.id'), primary_key=True)
    img_url = Column(String(200))
    caption = Column(Text())
    created_at = Column(DateTime())

feed_items = with_polymorphic(FeedItem, [Post, Image])
query = session.query(feed_items).order_by('created_at')

Bulk update children table records set on master table records

I am using SQL Server 2008 R2. I have imported 2 tables from Excel and I want to link them together. It looks like this:
Tables imported from Excel:
brand (nvarchar(20) name)
models (nvarchar(20) parent, nvarchar(50) name)
Tables after my amendments:
brand (int identity id, nvarchar(20) name, tinyint status)
models (int identity id, int parent_id, nvarchar(20) parent, nvarchar(50) name, tinyint status)
As you can see I'd like to link table models using parent_id to table brand using id.
Select is OK; I have done that.
What I need is to create a bulk update which would put the brand id into models.parent_id.
Conditions are:
set models.parent_id = brand.id where brand.name = models.parent
I hope it is clear. Basically I want to change the linking field from models.parent to models.parent_id. There is a possibility that brand.name can change, and if that happens table models would be unable to link to the correct parent.
And I want to do that in bulk, to go through all the records in brand and update all relevant records in models.
UPDATE m
SET parent_id = b.id
FROM models m
JOIN brand b ON b.name = m.parent
I'd then assume you want to remove models.parent:
ALTER TABLE models DROP COLUMN parent
UPDATE models
SET parent_id = brand.id
FROM brand
WHERE brand.name = models.parent