[Bug][Blueprints] 400 error when adding Azure DevOps scope to existing project
Search before asking
- [X] I had searched in the issues and found no similar issues.
What happened
On a project with an existing GitHub connection (such as #7503), adding an Azure DevOps connection to it gives a 400 error in the UI:
Inside the lake microservice container, the error is the following:
time="2024-05-24 12:40:03" level=error msg="HTTP 400 error\n\tcaused by: error patching the blueprint (400)\n\tWraps: (2)\n\tWraps: (3) fork/exec ./run.sh: argument list too long\n\tWraps: (4) fork/exec ./run.sh\n\tWraps: (5) argument list too long\n\tError types: (1) *hintdetail.withDetail (2) *hintdetail.withDetail (3) *hintdetail.withDetail (4) *fs.PathError (5) syscall.Errno"
[GIN] 2024/05/24 - 12:40:03 | 400 | 4.889660007s | 10.253.240.52 | PATCH "/blueprints/2"
What do you expect to happen
Azure DevOps scopes are added to the existing project.
How to reproduce
- Create a project
- Successfully add data scopes from GitHub
- Try to add Azure DevOps scopes
Anything else
Same environment where #7503 is happening.
Version
1.0.0-beta6
Are you willing to submit PR?
- [X] Yes I am willing to submit a PR!
Code of Conduct
- [X] I agree to follow this project's Code of Conduct
To resolve the 400 error when adding an Azure DevOps scope to an existing project with a GitHub connection, you need to ensure that the scope details and connection information are correctly mapped and validated. The error message indicates that the argument list is too long, which suggests that the payload size might be an issue.
Here are some steps to address this:
- **Batch Processing**: Modify the `PutScopes` function to handle payloads in smaller batches to prevent the "argument list too long" error. Here is an example of how you might adjust the function:

  ```go
  func (gs *GenericScopeApiHelper[Conn, Scope, ScopeConfig]) PutScopes(input *plugin.ApiResourceInput, scopes []*Scope) ([]*ScopeRes[Scope, ScopeConfig], errors.Error) {
      params, err := gs.extractFromReqParam(input, false)
      if err != nil {
          return nil, err
      }
      err = gs.dbHelper.VerifyConnection(params.connectionId)
      if err != nil {
          return nil, errors.Default.Wrap(err, fmt.Sprintf("error verifying connection for connection ID %d", params.connectionId))
      }
      if len(scopes) == 0 {
          return nil, nil
      }
      err = gs.validatePrimaryKeys(scopes)
      if err != nil {
          return nil, err
      }
      now := time.Now()
      batchSize := 100 // Adjust the batch size as needed
      for i := 0; i < len(scopes); i += batchSize {
          end := i + batchSize
          if end > len(scopes) {
              end = len(scopes)
          }
          batch := scopes[i:end]
          for _, scope := range batch {
              gs.setScopeFields(scope, params.connectionId, &now, &now)
              err = gs.verifyScope(scope, gs.validator)
              if err != nil {
                  return nil, errors.Default.Wrap(err, "error verifying scope")
              }
          }
          gs.setRawDataOrigin(batch...)
          err = gs.dbHelper.SaveScope(batch)
          if err != nil {
              return nil, errors.Default.Wrap(err, "error saving scope")
          }
      }
      apiScopes, err := gs.addScopeConfig(scopes...)
      if err != nil {
          return nil, errors.Default.Wrap(err, "error associating scope config to scope")
      }
      return apiScopes, nil
  }
  ```
- **Scope Mapping**: Ensure that the `MapScopeDetails` function correctly maps the scope details for both Azure DevOps and GitHub. This function differentiates between scope details by checking the repository type and processing them accordingly:

  ```go
  for _, scope := range scopeDetails {
      azuredevopsRepo, scopeConfig := scope.Scope, scope.ScopeConfig
      if azuredevopsRepo.Type != models.RepositoryTypeADO {
          continue
      }
      id := didgen.NewDomainIdGenerator(&models.AzuredevopsRepo{}).Generate(connectionId, azuredevopsRepo.Id)
      if utils.StringsContains(scopeConfig.Entities, plugin.DOMAIN_TYPE_CODE_REVIEW) ||
          utils.StringsContains(scopeConfig.Entities, plugin.DOMAIN_TYPE_CODE) {
          scopeRepo := code.NewRepo(id, azuredevopsRepo.Name)
          sc = append(sc, scopeRepo)
      }
      if utils.StringsContains(scopeConfig.Entities, plugin.DOMAIN_TYPE_CICD) {
          scopeCICD := devops.NewCicdScope(id, azuredevopsRepo.Name)
          sc = append(sc, scopeCICD)
      }
      if utils.StringsContains(scopeConfig.Entities, plugin.DOMAIN_TYPE_TICKET) {
          scopeTicket := ticket.NewBoard(id, azuredevopsRepo.Name)
          sc = append(sc, scopeTicket)
      }
  }
  for _, scope := range scopeDetails {
      azuredevopsRepo, scopeConfig := scope.Scope, scope.ScopeConfig
      if azuredevopsRepo.Type == models.RepositoryTypeADO {
          continue
      }
      id := didgen.NewDomainIdGenerator(&models.AzuredevopsRepo{}).Generate(connectionId, azuredevopsRepo.Id)
      if utils.StringsContains(scopeConfig.Entities, plugin.DOMAIN_TYPE_CICD) {
          scopeCICD := devops.NewCicdScope(id, azuredevopsRepo.Name)
          sc = append(sc, scopeCICD)
      }
      if !azuredevopsRepo.IsPrivate && utils.StringsContains(scopeConfig.Entities, plugin.DOMAIN_TYPE_CODE) {
          scopeRepo := code.NewRepo(id, azuredevopsRepo.Name)
          sc = append(sc, scopeRepo)
      }
  }
  ```
- **Validation**: Ensure that the `PutScopes` function includes validation checks to verify the scope details and prevent oversized payloads; a minimal sketch of such a check follows below.
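For illustration, here is a minimal sketch of such a pre-save check. The `maxScopesPerRequest` constant, the `validateScopeCount` helper, and its messages are assumptions for this example, not existing DevLake code; it assumes the same `core/errors` package that `PutScopes` already uses:

```go
package api

import (
    "fmt"

    "github.com/apache/incubator-devlake/core/errors"
)

// maxScopesPerRequest is an assumed limit for illustration only; tune it
// to whatever the deployment can actually handle in one request.
const maxScopesPerRequest = 500

// validateScopeCount is a hypothetical helper that PutScopes could call
// before saving, rejecting empty or oversized payloads early with a 400.
func validateScopeCount[Scope any](scopes []*Scope) errors.Error {
    if len(scopes) == 0 {
        return errors.BadInput.New("no scopes provided")
    }
    if len(scopes) > maxScopesPerRequest {
        return errors.BadInput.New(fmt.Sprintf(
            "too many scopes in one request: %d (max %d)", len(scopes), maxScopesPerRequest))
    }
    return nil
}
```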
By implementing these changes, you should be able to resolve the 400 error and successfully add Azure DevOps scopes to an existing project with a GitHub connection [1][2].
Would you like to try the AzureDevopsGo plugin? It is more robust and actively maintained by @mr-ks
Will do! What's the difference between the two?
Well, the one without the Go suffix is implemented in Python, while the Go one is written in Go.
@klesh somehow the repos I can see in the Python plugin are not showing up in the Go plugin.
Oh, I see. What is the difference between those repos? Looping in @mr-ks.
@slaterx, the two plugins use different database tables; did you manage to successfully create the connection in the Go plugin?
One important distinction is that the Go version excludes disabled repositories. Is this possibly why you're not seeing some repositories?
https://github.com/apache/incubator-devlake/blob/288c01e3c7b86a0f5beb3e073320ebb69d4f206b/backend/plugins/azuredevops_go/api/remote_helper.go#L150-L155
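For reference, that exclusion roughly amounts to skipping disabled repos when the remote scopes are listed. The sketch below only illustrates the behaviour; the `remoteRepo` type, the `IsDisabled` field, and `filterEnabledRepos` are assumed names, not the exact code behind the link above.

```go
// Simplified illustration of the current Go-plugin behaviour: repositories
// flagged as disabled are dropped before they are offered as data scopes.
type remoteRepo struct {
    Id         string
    Name       string
    IsDisabled bool
}

func filterEnabledRepos(repos []remoteRepo) []remoteRepo {
    enabled := make([]remoteRepo, 0, len(repos))
    for _, r := range repos {
        if r.IsDisabled {
            continue // disabled repos never show up in the remote-scope list
        }
        enabled = append(enabled, r)
    }
    return enabled
}
```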
@mr-ks thank you for explaining the distinction; indeed, the difference comes from the disabled repos showing up in the Python plugin but not in the Go one.
In Azure DevOps land, a disabled repo means that only the source code is unavailable. Pipelines, boards and other items might still be available and actively used by teams (which is our case).
I think we should allow disabled repos because the use case sounds reasonable.
Sounds good. I'll re-enable disabled repositories but exclude them from tasks of the code domain. I think PRs and such are not available for disabled repos. What do you think, @klesh, @slaterx?
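For illustration, one way that gating could look is sketched below; the task names, the `repoDisabled` flag, and `planTasksForRepo` are assumptions for the example rather than the planned implementation.

```go
// Illustrative sketch only: keep disabled repos selectable, but skip
// code-domain subtasks for them since their source code is unavailable.
var codeDomainTasks = map[string]bool{
    "collectApiCommits":      true, // assumed task names for illustration
    "collectApiPullRequests": true,
}

func planTasksForRepo(repoDisabled bool, allTasks []string) []string {
    tasks := make([]string, 0, len(allTasks))
    for _, t := range allTasks {
        if repoDisabled && codeDomainTasks[t] {
            continue // drop code-domain work for disabled repos
        }
        tasks = append(tasks, t)
    }
    return tasks
}
```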