
Google Cloud Run updated their limits on maxScale based on memory and CPU count

Open: fgregg opened this issue on Aug 10 '22

If you don't set an explicit limit on container scaling, Google defaults to 100.

Google recently updated the limits on container scaling, such that if you set up Datasette to use more memory or CPU, you need to set the maxScale argument to something much smaller than 100.

It would be nice if datasette publish could do this math for you and set the right maxScale.
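For illustration only, here's a rough sketch of what that math could look like, assuming Cloud Run caps the total memory and CPU a service may provision across all of its instances (the quota numbers and the safe_max_scale helper are made up for this sketch; the real limits are documented by Google and may differ by project and region):

    import math

    # Assumed quotas for illustration only; these are NOT Google's real numbers.
    TOTAL_MEMORY_GI_QUOTA = 200
    TOTAL_CPU_QUOTA = 100

    def safe_max_scale(memory_gi, cpu, default=100):
        """Largest maxScale that keeps total provisioned memory and CPU
        under the assumed quotas, capped at Cloud Run's default of 100."""
        by_memory = math.floor(TOTAL_MEMORY_GI_QUOTA / memory_gi)
        by_cpu = math.floor(TOTAL_CPU_QUOTA / cpu)
        return max(1, min(default, by_memory, by_cpu))

    # A deployment asking for 8Gi and 2 CPUs per instance would need a much
    # smaller maxScale than the default of 100:
    print(safe_max_scale(8, 2))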

Log of a failing publish run.

fgregg · Aug 10 '22 13:08

Maybe a simpler solution is to set maxScale to something like 2, since Datasette is not set up to make use of container scaling anyway?

fgregg · Aug 10 '22 13:08

I'm going to default maxScale to 2 but provide an extra command-line option for datasette publish cloudrun that lets people set it to something higher if they want to.

simonw · Aug 14 '22 16:08

Actually I disagree that Datasette isn't set up to make use of container scaling: part of the idea behind the Baked Data pattern is that you can scale to handle effectively unlimited traffic by running multiple copies of your application, each with its own duplicate copy of the database.

So I'm going to default maxScale to 10 and still let people customize it.

simonw · Aug 14 '22 16:08

Here's the relevant part of the datasette publish command from that failed Actions workflow:

  datasette publish cloudrun f7.db nlrb.db opdr.db old_nlrb.db voluntary_recognitions.db work_stoppages.db lm20.db chips.db \
    --memory 8Gi \
    --cpu 2

simonw · Aug 14 '22 16:08

Tried duplicating this error locally but the following command succeeded when I expected it to fail:

datasette publish cloudrun fixtures.db --memory 8Gi --cpu 2 --service issue-1779

simonw · Aug 14 '22 16:08

Maybe I need to upgrade:

% gcloud --version
Google Cloud SDK 378.0.0
alpha 2022.03.18
bq 2.0.74
core 2022.03.18
gsutil 5.8
Updates are available for some Google Cloud CLI components.  To install them,
please run:
  $ gcloud components update
% gcloud components update
Beginning update. This process may take several minutes.


Your current Google Cloud CLI version is: 378.0.0
You will be upgraded to version: 397.0.0

┌─────────────────────────────────────────────────────────────────────────────┐
│                      These components will be updated.                      │
├─────────────────────────────────────────────────────┬────────────┬──────────┤
│                         Name                        │  Version   │   Size   │
├─────────────────────────────────────────────────────┼────────────┼──────────┤
│ BigQuery Command Line Tool                          │     2.0.75 │  1.6 MiB │
│ BigQuery Command Line Tool (Platform Specific)      │     2.0.75 │  < 1 MiB │
│ Cloud Storage Command Line Tool                     │       5.11 │ 15.5 MiB │
│ Cloud Storage Command Line Tool (Platform Specific) │       5.11 │  < 1 MiB │
│ Google Cloud CLI Core Libraries                     │ 2022.08.05 │ 24.3 MiB │
│ Google Cloud CLI Core Libraries (Platform Specific) │ 2022.08.05 │  < 1 MiB │
│ anthoscli                                           │     0.2.28 │ 48.0 MiB │
│ gcloud Alpha Commands                               │ 2022.08.05 │  < 1 MiB │
│ gcloud cli dependencies                             │ 2022.07.29 │ 11.2 MiB │
└─────────────────────────────────────────────────────┴────────────┴──────────┘

A lot has changed since your last upgrade.  For the latest full release notes,
please visit:
  https://cloud.google.com/sdk/release_notes
% gcloud --version        
Google Cloud SDK 397.0.0
alpha 2022.08.05
bq 2.0.75
core 2022.08.05
gsutil 5.11

simonw · Aug 14 '22 16:08

datasette publish cloudrun fixtures.db --memory 8Gi --cpu 2 --service issue-1779 still works.

simonw · Aug 14 '22 16:08

Just spotted this in the failing Actions workflow:

gcloud config set run/region us-central1

I tried that locally too, but the deploy still succeeded.

simonw · Aug 14 '22 16:08

Just tried this instead, and it still worked and deployed OK:

datasette publish cloudrun fixtures.db --memory 16Gi --cpu 4 --service issue-1779

@fgregg I'm not able to replicate your deployment failure, I'm afraid.

simonw · Aug 14 '22 16:08

(I deleted my issue-1779 project using the UI at https://console.cloud.google.com/run?project=datasette-222320)

simonw · Aug 14 '22 16:08

Here's the start of the man page for gcloud run deploy:

NAME
    gcloud run deploy - deploy a container to Cloud Run

SYNOPSIS
    gcloud run deploy [[SERVICE] --namespace=NAMESPACE] [--args=[ARG,...]]
        [--async] [--command=[COMMAND,...]] [--concurrency=CONCURRENCY]
        [--cpu=CPU] [--ingress=INGRESS; default="all"]
        [--max-instances=MAX_INSTANCES] [--memory=MEMORY]
        [--min-instances=MIN_INSTANCES]

I'm going to expose --max-instances and --min-instances as extra options to datasette publish cloudrun.
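As a rough sketch of how those options might be wired up with click and passed through to gcloud run deploy (this is an assumption about the shape of the code, not the actual datasette publish cloudrun implementation; the service name and image below are placeholders):

    import subprocess
    import click

    @click.command()
    @click.option("--min-instances", type=int, default=None)
    @click.option("--max-instances", type=int, default=None)
    def cloudrun(min_instances, max_instances):
        # Placeholder service and image; datasette publish builds the real command.
        cmd = [
            "gcloud", "run", "deploy", "my-service",
            "--image", "gcr.io/my-project/my-image", "--platform", "managed",
        ]
        # Only pass the flags through when the user set them, so gcloud's own
        # defaults apply otherwise.
        if min_instances is not None:
            cmd += ["--min-instances", str(min_instances)]
        if max_instances is not None:
            cmd += ["--max-instances", str(max_instances)]
        subprocess.run(cmd, check=True)

    if __name__ == "__main__":
        cloudrun()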

simonw · Aug 14 '22 16:08

Tested that with:

datasette publish cloudrun fixtures.db --service issue-1779 --min-instances 2 --max-instances 4
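One way to sanity-check that the flags took effect is to read back the Knative autoscaling annotations from the deployed service. This snippet is not from the thread; the service name and region are reused from earlier comments:

    import json
    import subprocess

    # Describe the deployed Cloud Run service as JSON and pull out the
    # annotations that --min-instances / --max-instances map to.
    raw = subprocess.check_output([
        "gcloud", "run", "services", "describe", "issue-1779",
        "--platform", "managed", "--region", "us-central1", "--format", "json",
    ])
    annotations = json.loads(raw)["spec"]["template"]["metadata"]["annotations"]
    print("minScale:", annotations.get("autoscaling.knative.dev/minScale"))
    print("maxScale:", annotations.get("autoscaling.knative.dev/maxScale"))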

simonw · Aug 14 '22 17:08

thanks @simonw!

fgregg · Aug 14 '22 19:08