Moreno Garcia e Silva

Results: 14 issues by Moreno Garcia e Silva

**Is your feature request related to a problem? Please describe.** Sometimes customers are caught by surprise by their API usage and overages. **Describe the solution you'd like** Customers would like to...

**Is your feature request related to a problem? Please describe.** It is more of an oversight. Whenever an Environment is deleted within a project, the API still returns calls if...

**Is your feature request related to a problem? Please describe.** Users are frequently working on a limited number of features. Our feature list is currently sorted by name, ascending. It would...

**Is your feature request related to a problem? Please describe.** The current view shows total requests per type per day. You can see the variation across the week/month, but there is...

[DEPRECATION WARNING]: 'include' for playbook includes. You should use 'import_playbook' instead. This feature will be removed in version 2.8. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg. [DEPRECATION...

[DEPRECATION WARNING]: The 'ec2_facts' module is being renamed 'ec2_metadata_facts'. This feature will be removed in version 2.7. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.

I would like to report an issue on the page https://cpp-driver.docs.scylladb.com/master/topics/basics/batches/index ### Problem There are no examples of batch statements using prepared statements inside, just simple statements. ### Suggest a fix...
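For reference, a minimal sketch of what such an example might look like with the cpp-driver C API, assuming an already-connected `session`; the keyspace/table name, bound values, and batch type are placeholders, and error checking on the futures is omitted:

```
#include <cassandra.h>

/* Assumes `session` is an already-connected CassSession*. */
void insert_batch_with_prepared(CassSession* session) {
  /* Prepare the statement once and reuse it for every batch entry. */
  CassFuture* prepare_future =
      cass_session_prepare(session, "INSERT INTO ks.tbl (k, v) VALUES (?, ?)");
  cass_future_wait(prepare_future);
  const CassPrepared* prepared = cass_future_get_prepared(prepare_future);
  cass_future_free(prepare_future);

  CassBatch* batch = cass_batch_new(CASS_BATCH_TYPE_LOGGED);

  for (cass_int32_t i = 0; i < 3; ++i) {
    /* Bind values to a statement created from the prepared statement. */
    CassStatement* statement = cass_prepared_bind(prepared);
    cass_statement_bind_int32(statement, 0, i);
    cass_statement_bind_string(statement, 1, "value");
    cass_batch_add_statement(batch, statement);
    cass_statement_free(statement); /* the batch holds its own reference */
  }

  /* Execute the whole batch in a single request. */
  CassFuture* batch_future = cass_session_execute_batch(session, batch);
  cass_future_wait(batch_future);
  cass_future_free(batch_future);

  cass_batch_free(batch);
  cass_prepared_free(prepared);
}
```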

1. Allow passing the keyspace and table as command-line parameters. 2. Allow passing multiple tables in the YAML, comma separated. Ex.: tables=table1,table2,table3

It would be good to be able to write the "savepoint" parameter to the cluster's HDFS. For example: savepoints: path: hdfs://... Is it possible to add a flag to choose whether I want...

@iravid For arguments to the Spark connector like username and password (authentication), can we still do this?

```
spark-submit --class com.scylladb.migrator.Migrator \
  --master spark://:7077 \
  --conf spark.cassandra.auth.username="cassandra" --conf spark.cassandra.auth.password="cassandra" --conf...
```
