Inference problems with insertions
Hi,
I'm in the process of trying out this library. After a gruelling day and a half spent wrangling my ten tables into beam-migrate and trying to figure out beam-migrate-cli (staying with autoMigrate in the end), I'm finally trying to write my first query. I'm translating the core of another application that uses Persistent - what drove me to try this out is the support for mixins, which I'm trying to use to the fullest.
Anyway, the query I'm trying to build is just a simple test fixture for inserting users:
addUsers :: Pg ()
addUsers = do
  now <- zonedTimeToLocalTime <$> liftIO getZonedTime
  users <- forM ["admin", "user"] $ \u -> do
    hash <- liftIO $ T.pack . BS.unpack <$> makePassword (BS.pack $ T.unpack u) 17
    return $ User default_ (val_ u) (just_ $ val_ hash) (val_ True) (emptyBlamable Nothing)
  runInsert . insert (user starletDb) $ insertExpressions users
The problem I have is with the s (scope?) that insertExpressions is forall'd over: GHC tells me that it Couldn't match type ‘s’ with ‘s'’ because type variable ‘s'’ would escape its scope.
Trying to explicitly annotate users :: [forall s. UserT (QExpr PgExpressionSyntax s)] <- ... makes the compiler give up with the error Illegal polymorphic type: forall s. QExpr syntax s. GHC doesn't yet support impredicative polymorphism.
The only (ugly) solution I can think of is to use a newtype to hide the s parameter, though I'm not sure that'd work the way I expect it to.
I did manage to make the query work by moving the User construction inside insertExpressions:
addUsers :: Pg ()
addUsers = do
  now <- zonedTimeToLocalTime <$> liftIO getZonedTime
  users <- forM ["admin", "user"] $ \u -> do
    hash <- liftIO $ T.pack . BS.unpack <$> makePassword (BS.pack $ T.unpack u) 17
    return (u, hash)
  runInsert . insert (user starletDb) $ insertExpressions $ flip fmap users $ \(u, p) ->
    User default_ (val_ u) (just_ $ val_ p) (val_ "") (val_ True) (emptyBlamable Nothing)
That solves this particular problem, but I'd like to know how to do it with the first construction (if it's possible to convince the compiler...).
Also, I've stumbled upon another inference-related problem: replacing runInsert $ insert (user starletDb) $ insertExpressions $ flip fmap users $ ... with the nicer point-free version runInsert . insert (user starletDb) . insertExpressions . flip fmap users $ ... makes the compiler want to infer [UserT (QExpr syntax s)] with a single unified s, whereas the immediately applied version keeps the forall s.
I'm sure there is an easy solution to both of these, but while I've played with TypeFamilies et al. quite a bit, I haven't really used this type of nested foralls yet, so I'm reduced to having a one-sided conversation with GHC :)
Hello,
beam-migrate-cli is not yet complete... Sorry! Don't use it, except for the simple commands that generate a Haskell schema from a database. I'm still figuring it out, but I've been swamped by work commitments and personal matters.
As to your other question:
It is not possible to do what you'd like with the first method, because GHC forces anything that is shuttled through the monadic bind to have a monomorphic type. I believe this restriction cannot be worked around at beam's level. You could use an existential type, as you mention, but you'd have to unwrap it. This is the same issue you'd come across if you tried to construct a value of type ST s monadically, to run later using runST.
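To make the ST analogy concrete, here's a minimal sketch that uses only base (nothing beam-specific) but trips over exactly the same error:

-- A sketch of the analogous ST situation; plain GHC + base, no beam.
import Control.Monad.ST (ST, runST)
import Data.STRef (newSTRef, readSTRef, writeSTRef)

-- Fine: the whole computation is built inside runST's argument, so it
-- stays polymorphic in the state token s.
fine :: Int
fine = runST $ do
  ref <- newSTRef (0 :: Int)
  writeSTRef ref 42
  readSTRef ref

-- Not fine (kept as a comment): binding the action in IO first pins it to
-- a monomorphic ST s0 Int, and runST can no longer generalise over s --
-- you get the same "would escape its scope" error as above.
-- broken :: IO Int
-- broken = do
--   action <- return (newSTRef (0 :: Int) >>= readSTRef)
--   return (runST action)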
You've stumbled upon a nasty case, but this scoping is a feature that's meant to protect you. The reason beam wants it to be existentially quantified over the scope is that otherwise you could inject a variable from another query, defined elsewhere, that is not being executed here.
The second problem you've stumbled across is a well-known limitation. GHC actually has special cases for standard Haskell types like ST, so this catches many Haskellers by surprise, since the same pattern works there. Unfortunately, I don't know of any way to make GHC figure this out for an external library. If you have suggestions for how GHC could work around this, direct them to the GHC issue tracker; I'd be happy to support you. I believe the feature is called 'impredicative polymorphism', and there is some work to add it to GHC.
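As far as I know, the special case in question is the one GHC applies to ($): it's what lets runST $ ... type-check, while ordinary composition gets no such treatment -- which is exactly the shape of the point-free failure you saw. A minimal sketch, again with plain ST:

-- Sketch only; assumes GHC 8.x without ImpredicativeTypes.
import Control.Monad.ST (runST)
import Data.STRef (newSTRef, readSTRef)

-- Accepted: GHC has a special typing rule for ($) that allows the
-- higher-rank argument of runST here.
applied :: Int
applied = runST $ newSTRef 1 >>= readSTRef

-- Rejected (kept as a comment): composing with runST would require
-- instantiating (.) at a forall'd type, i.e. impredicative polymorphism.
-- composed :: Int
-- composed = (runST . id) (newSTRef 1 >>= readSTRef)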
Sorry for the tone and length of the first post - I wrote it after 12 hours of staring at Haskell code, and I was a bit frustrated...
Anyway, thanks for the speedy reply. It's unfortunate that this happens, but if it's unavoidable and the s's can't be unified, then providing a workaround and documenting it would be an adequate solution. Maybe it's hidden somewhere in the documentation and I just can't find it?
I've played with the types a bit more today, and what worked for me was Data.Functor.Compose and Data.Some - a standalone example in case someone stumbles upon the same problem:
-- from Data.Some
newtype Some tag = This { getThis :: forall t. tag t }
-- from Data.Functor.Compose
newtype Compose f g a = Compose { getCompose :: f (g a) }
addUsers :: Pg ()
addUsers = do
  users <- forM ["admin", "user"] $ \u -> do
    return $ This $ Compose $ User default_ (val_ u) (just_ $ val_ "") (val_ "")
  runInsert . insert (user starletDb) $ insertExpressions $ fmap (getCompose . getThis) users
Or:
newtype Some2 f g = This2 { getThis2 :: forall t. f (g t) }
addUsers :: Pg ()
addUsers = do
  users <- forM ["admin", "user"] $ \u -> do
    return $ This2 $ User default_ (val_ u) (just_ $ val_ "") (val_ "")
  runInsert . insert (user starletDb) $ insertExpressions $ fmap getThis2 users
It's unfortunate that even replacing return $ This $ Compose with return . This . Compose results in more type errors, as (.) tries to strip the forall.
Anyway, I guess I'll bug you with further issues in the future - I'm really enjoying the backend types, especially after playing with HKD on my own, but coming from Persistent (which does operate at a slightly different level of abstraction), Beam still feels quite raw.
As an example, queries seem very verbose to me - e.g. runInsert . insert (table db) . insertX $ [...] vs insertX (table db) [...], which is just a matter of combinators/utilities. Maybe it's out of scope, but an (optional?) typeclass associating a TableT with a TableEntity TableT value would remove even the (table db) bit if there were wrapper functions that used it. That would give a choice between using type annotations and TableEntity values, and it might even be derivable using Generic - see the sketch below. Of course, that wouldn't work with two or more tables with the same type. I might try implementing something like that if I have the time next weekend.
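Something like this is what I have in mind - the names (CanonicalEntity, canonicalEntity, insertValuesInto, StarletDb) are made up, and the beam types are written from memory, so take it as a rough shape rather than a finished design:

{-# LANGUAGE MultiParamTypeClasses #-}
import Database.Beam

-- Sketch: a class that picks "the" entity for a given table type out of the
-- database settings, so call sites no longer pass (user starletDb) around.
-- All names here are invented; DatabaseSettings/DatabaseEntity/TableEntity
-- are beam's types, quoted from memory.
class CanonicalEntity be db tbl where
  canonicalEntity :: DatabaseSettings be db -> DatabaseEntity be db (TableEntity tbl)

-- A wrapper built on top of it; the concrete constraints would mirror
-- whatever insert/insertValues need in the beam version at hand:
-- insertValuesInto db rows = runInsert $ insert (canonicalEntity db) $ insertValues rows

-- For the schema in this thread the instance would presumably be just
-- (StarletDb standing in for whatever the settings type is actually called):
-- instance CanonicalEntity Postgres StarletDb UserT where
--   canonicalEntity = user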
Feel free to open up pull requests with new features. That being said:
Of course, that wouldn't work with two or more tables with the same type
the ability to have two tables of the same type is a necessary feature as far as I'm concerned.
Beam still feels quite raw.
That is the explicit intent here. In general, I won't approve pull requests that try to offer additional abstraction. SQL is a great abstraction for what it attempts to be -- a query language for relational data. Adding further abstraction on top of it just causes confusion, because those abstractions inevitably leak.
As to the verbosity of things like INSERT: inserting values, inserting expressions, and inserting the results of a subquery are three different ways of getting data, so we need a separate function for each (or some other way to distinguish them). There are also multiple ways to run an insert statement -- either directly, or returning what was inserted. Beam's current representation surfaces these choices to the programmer because they are significant. It is incredibly straightforward to define helper functions if you're only inserting values and you want to write something like insertInto <table> [ <values> ]:
insertInto tbl values = runInsert $ insert tbl $ insertValues values
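At a call site that would look something like this (a sketch reusing the names from earlier in this thread; the row is an ordinary UserT Identity built from plain Haskell values):

-- Hypothetical usage of the helper above. UserT, user and starletDb come
-- from earlier in the thread; Identity is Data.Functor.Identity.
addAdmin :: UserT Identity -> Pg ()
addAdmin adminRow = insertInto (user starletDb) [ adminRow ]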
I believe this and https://github.com/tathougies/beam/issues/364 are both manifestations of this QExpr type-checking problem. My understanding of the problem and what it purports to solve is still poor, but it smells like we could maybe dial back the type-level magic to make the common case a better experience: inserting new records into a table with an autoincrement field and maybe the odd now(), with no expressions defined elsewhere.