✨ Support field max_length
Currently, when max_length is specified in a Field, it has no real effect on the corresponding SQL schema (#126, #746). As pointed out by @chris-beedie, this is because the configured max_length is stored as an annotated_types.MaxLen, rather than a PydanticMetadata object, and is therefore ignored in get_field_metadata.
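To illustrate the root cause, here is a quick way to check what Pydantic stores for such a field (the printed output is what I'd expect with Pydantic v2; treat it as illustrative):

```python
from sqlmodel import Field, SQLModel


class Hero(SQLModel):
    name: str = Field(max_length=5)


# The constraint lands in the field's metadata as an annotated_types.MaxLen,
# not as a PydanticMetadata object, so get_field_metadata used to skip it.
print(Hero.model_fields["name"].metadata)
# expected: [MaxLen(max_length=5)]
```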
This PR adds a simple extra check for that specific case. I have confirmed it works using PostgreSQL 16.2.
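For context, here is a rough sketch of the kind of check involved. This is not the literal diff; `FakeMetadata` and the surrounding loop are assumptions about SQLModel's internals:

```python
from typing import Any, Optional

from annotated_types import MaxLen


class FakeMetadata:  # stand-in for SQLModel's internal fallback metadata object
    max_length: Optional[int] = None
    max_digits: Optional[int] = None
    decimal_places: Optional[int] = None


def get_field_metadata(field: Any) -> Any:
    metadata = FakeMetadata()
    for meta in field.metadata:
        # ...existing handling of PydanticMetadata objects stays as-is...
        if isinstance(meta, MaxLen):  # extra check: max_length arrives as a MaxLen
            metadata.max_length = meta.max_length
    return metadata
```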
I also wanted to write a test for it, along the following lines:
```python
from typing import Optional

import pytest
from sqlalchemy.exc import IntegrityError
from sqlmodel import Field, Session, SQLModel, create_engine


def test_field_max_length_is_enforced(clear_sqlmodel):
    class Hero(SQLModel, table=True):
        id: Optional[int] = Field(default=None, primary_key=True)
        name: str = Field(max_length=5)

    engine = create_engine("sqlite://")
    SQLModel.metadata.create_all(engine)

    with pytest.raises(IntegrityError):
        with Session(engine) as session:
            hero_1 = Hero(name="Deadpond")
            session.add(hero_1)
            session.commit()
            session.refresh(hero_1)
```
Unfortunately, this doesn't work, as SQLite does not actually enforce length limits like VARCHAR(5).
One solution would be to use a testcontainer (see here) to run a PostgreSQL instance and test against that, but it would significantly increase the complexity & dependencies of the unit tests.
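A rough sketch of what that could look like (assuming the `testcontainers` package with its PostgreSQL extra and a local Docker daemon; PostgreSQL should surface the overlong insert as a DataError rather than an IntegrityError):

```python
from typing import Optional

import pytest
from sqlalchemy.exc import DataError
from sqlmodel import Field, Session, SQLModel, create_engine
from testcontainers.postgres import PostgresContainer


def test_field_max_length_is_enforced(clear_sqlmodel):
    class Hero(SQLModel, table=True):
        id: Optional[int] = Field(default=None, primary_key=True)
        name: str = Field(max_length=5)

    # Spin up a throwaway PostgreSQL instance for the duration of the test
    with PostgresContainer("postgres:16") as postgres:
        engine = create_engine(postgres.get_connection_url())
        SQLModel.metadata.create_all(engine)

        with pytest.raises(DataError):
            with Session(engine) as session:
                session.add(Hero(name="Deadpond"))
                session.commit()
```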
FYI, I stumbled across this while trying to debug SQLModel with DB2. The CREATE TABLE SQL ends up using VARCHAR(None), which DB2 rejects; I tried max_length, that didn't work, hence I came here.
Not sure if that helps, but DB2 doesn't like unbounded varchars.
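For anyone who wants to see the emitted DDL without a DB2 instance, something along these lines should show it (illustrative sketch, not my exact setup):

```python
from typing import Optional

from sqlalchemy.schema import CreateTable
from sqlmodel import Field, SQLModel, create_engine


class Hero(SQLModel, table=True):
    id: Optional[int] = Field(default=None, primary_key=True)
    name: str = Field(max_length=5)


engine = create_engine("sqlite://")
# Print the CREATE TABLE statement the dialect would emit.
print(CreateTable(Hero.__table__).compile(engine))
# Without the fix, the name column compiles without a length
# (the DB2 dialect renders it as VARCHAR(None)); with the fix
# it should come out as VARCHAR(5).
```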
A vote from me as well. For many database-centric applications, it is a significant issue if the max_length of a string does not produce a corresponding TEXT(max_length) or VARCHAR(max_length) etc. in the table.
Is there any way this can be escalated, or would you accept an updated pull request from me if I were able to resolve it?