Théo LEBRUN

Results 12 comments of Théo LEBRUN

I had to update the value of `log-path` in my dbt_project.yml (https://docs.getdbt.com/reference/project-configs/log-path) to something like `/usr/local/airflow/tmp/logs` in order to run on AWS MWAA.

Hey @prakash260, Try updating the `target-path` property too (https://docs.getdbt.com/reference/project-configs/target-path), for example with `/usr/local/airflow/tmp/target`. There may be a better approach than pointing at a temp folder, such as disabling dbt's logs/target generation entirely.
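Combined, the two overrides look like this in `dbt_project.yml` (a sketch: these exact paths are just what worked on MWAA, adjust them to your environment):

```yaml
# dbt_project.yml — redirect dbt's writable outputs to MWAA's temp folder,
# since the dags folder is read-only on MWAA workers.
log-path: "/usr/local/airflow/tmp/logs"
target-path: "/usr/local/airflow/tmp/target"
```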

Hey @Gatsby-Lee, I agree with @maker100, you should avoid running heavy processes like dbt directly on MWAA. My Airflow DAG triggers an ECS task that runs on Fargate to run...

1. I used this operator to trigger my ECS task: https://airflow.apache.org/docs/apache-airflow-providers-amazon/stable/operators/ecs.html
2. Yes, I have a Dockerfile that uses an image with dbt (https://hub.docker.com/r/fishtownanalytics/dbt), and then my dbt code is...
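A minimal sketch of such a Dockerfile (the image tag, paths, and entrypoint are assumptions for illustration, not the exact setup described above):

```dockerfile
# Sketch only: the tag and project layout are assumptions.
FROM fishtownanalytics/dbt:1.0.0

# Copy the dbt project into the image.
COPY . /dbt
WORKDIR /dbt

# Run dbt when the ECS task starts; a profiles.yml is expected in /dbt.
ENTRYPOINT ["dbt", "run", "--profiles-dir", "."]
```

The ECS task definition then points at this image, and the Airflow ECS operator only has to launch the task on the Fargate cluster.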

I know this issue is closed; I'm only interested in the `userName` column, as I think it would be useful to have for audit purposes. `userMetadata` can be used...

@zsxwing @allisonport-db Not sure if you saw my previous message.

Hello @joaquimsage, Yes correct! The ECS Airflow operator can be used to run your task definition on your ECS cluster (use Fargate so you don't have to manage EC2s). One...

> the initial error was that a KeyError removing job.id from self._running_jobs I "fixed" that by calling `self._running_jobs.discard(job.id)` instead, so it doesn't fail when the key is missing. Not a...
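The difference matters because `set.remove` raises `KeyError` for a missing element while `set.discard` is a no-op. A standalone sketch (the job ids are made up):

```python
# Minimal sketch of why discard() avoids the KeyError (job ids are made up).
running_jobs = {"job-1", "job-2"}

# remove() raises KeyError if the element is absent.
try:
    running_jobs.remove("job-3")
except KeyError:
    print("remove() raised KeyError")

# discard() silently does nothing for a missing element.
running_jobs.discard("job-3")
print(sorted(running_jobs))  # → ['job-1', 'job-2']
```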

I was able to use custom modules by placing them in the `dbt_modules` folder inside my dbt project. The whole dbt project then lives in the `dags` folder on...
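A sketch of the layout this implies (the project and model names are hypothetical):

```text
dags/
└── my_dbt_project/          # whole dbt project shipped inside the dags folder
    ├── dbt_project.yml
    ├── models/
    └── dbt_modules/         # vendored custom modules/packages
```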

@nchammas Try using a SQL warehouse instead, that sounds easier than using inline parameters.