
Rewrite vault core, issue AppRoles to minions

lkubb opened this pull request 3 years ago

What does this PR do?

Sorry for this huge PR; it is mostly intended to inquire whether there is interest in getting this merged and to gather feedback. It definitely needs a lot of tests.

  • Fundamentally rewrites the Vault integration core and provides higher-level abstractions to interact with Vault.
  • Issues AppRoles to minions and manages their metadata, allowing much simpler ACL policies.
  • Makes use of response wrapping to distribute secrets.
  • Uses Salt cache classes instead of files directly.
  • Changes the configuration format; the old format is translated automatically (see the sketch after this list).
  • Corrects some docs, adds configuration examples and required policies.
  • Removes __utils__ use.
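
As a minimal sketch of what the automatic translation covers, assuming the detail mentioned later in this thread that url moves under server: in the new scheme (the remaining key names are illustrative, not authoritative):

# Old style, still accepted and translated internally:
vault:
  url: https://vault.example.com:8200
  auth:
    method: token
    token: s.EXAMPLE

# New style, with connection details nested under server:
vault:
  server:
    url: https://vault.example.com:8200
  auth:
    method: token
    token: s.EXAMPLE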

Background

While working on https://github.com/saltstack/salt/pull/62674, I noticed that a better approach would be to issue AppRoles to minions. While implementing that, I was a bit frustrated with the abstraction level and found myself in a yak-shaving situation. The missing abstraction causes issues such as https://github.com/saltstack/salt/issues/62651.

Builds on https://github.com/saltstack/salt/pull/62674. See also: https://discuss.hashicorp.com/t/saltstack-vault-and-host-role-policies/19214

What issues does this PR fix or reference?

Fixes: https://github.com/saltstack/salt/issues/62380 https://github.com/saltstack/salt/issues/58174 https://github.com/saltstack/salt/pull/62552 https://github.com/saltstack/salt/pull/59827

Likely: https://github.com/saltstack/salt/issues/60779 https://github.com/saltstack/salt/issues/57561

Previous Behavior

  • Adding new Vault behavior is cumbersome.
  • The master fetches tokens in plain text and sends them to minions, which are unknown to Vault.
  • To securely assign ACL policies to issued tokens, you need to create a separate policy for each minion, as sketched below. This would be mitigated somewhat by the mentioned pillar templating PR, so that a separate policy would only be necessary for each defined role.
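
For illustration, a hypothetical sketch of the old model's per-minion policies; the paths mirror the example in the next section and are an assumption, not the documented scheme:

# One dedicated policy per minion, e.g. salt_minion_web1 ...
path "salt/data/minions/web1" {
    capabilities = ["read"]
}

# ... plus a nearly identical salt_minion_db1, and so on for every minion.
path "salt/data/minions/db1" {
    capabilities = ["read"]
}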

New Behavior

  • Adding new Vault behavior has a much lower entry barrier and less potential for mistakes.
  • Secrets are distributed via response wrapping tokens, ensuring integrity and secrecy (a minimal unwrap sketch follows the policy example below). The master can be configured to issue AppRoles and entities to minions, so Vault knows about them and their associated pillars.
  • In theory, a single policy for all Salt minions is sufficient for most behaviors:
path "salt/data/minions/{{identity.entity.metadata.minion-id}}" {
    capabilities = ["read"]  # change capabilities as necessary
}

path "salt/data/roles/{{identity.entity.metadata.role}}" {
    capabilities = ["read"]
}

# ... match other custom metadata that is managed by Salt
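
To make the response wrapping point above concrete: the master only ever handles a short-lived wrapping token, and the recipient redeems it exactly once. A conceptual CLI equivalent (the token value is made up; within Salt, the unwrapping happens inside the vault modules, not manually):

# Redeeming a response-wrapping token is a one-shot operation; a second
# attempt fails, which makes interception detectable.
vault unwrap s.EXAMPLEwrappingTOKEN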

Merge requirements satisfied?

[NOTICE] Bug fixes or features added to Salt require tests.

  • [x] Docs
  • [ ] Changelog - https://docs.saltproject.io/en/master/topics/development/changelog.html
  • [ ] Tests written/updated

Commits signed with GPG?

Yes

lkubb · Sep 14 '22 07:09

From the looks of it, this is a breaking change, right? Meaning that people who currently use the Vault integration will need to adjust at least their configuration (or worse) to get it working again.

If that's the case, I wonder whether it wouldn't make sense to deprecate the current modules and provide this as a "v2" or similar.

lukasraska · Sep 16 '22 06:09

The intention is for this to be backwards-compatible. The current configuration scheme is translated, and in case the master is upgraded before the minions, the current runner endpoint is still available, same as the current public utility functions in case custom modules are using those. They are marked as deprecated though.

lkubb · Sep 18 '22 17:09

~~Okay, this will be ready for review very soon. Some notes:~~

  • ~~For the docs to build, pygments>=2.9.0 has to be included in requirements/ci/docs.in (I assume, since lexing works with the current version and 2.9 includes Terraform 0.14 syntax support). It is unclear to me how to correctly rebuild the requirement files with pip-compile.~~ ~~It seems even pygments==2.3.0 does not throw an error; not sure what's happening here atm.~~
  • ~~I am not sure how to best add the improvements in this PR to the changelog. Should I open a feature request to obtain an issue ID?~~
  • ~~The failing test test_loader.test_extension_discovery_without_reload_with_importlib_metadata_installed seems to be an effect of importlib_metadata v5 being released, which removes entry_points() -> dict~~
  • ~~This PR builds on the pillar templating one mentioned in the entry post, so some of the included changes are described there.~~

Thanks for your hard work on Salt. :) Hope I can contribute something back with this. Imho Vault is a great fit with Salt and offers quite different benefits and tradeoffs compared to the usual pillar files. Also, sorry once again for the LOC in this PR; I wasn't able to come up with a simple way to break this part up into multiple ones. Cheers!

lkubb · Oct 04 '22 22:10

I am noticing that the config file changes in this PR; things are moved around etc. But you said earlier that it should still work with old-style items, with things like url being in the old style, whereas in the new one url belongs under server:. If it will work with the old style, I'm good with that. One big change I am seeing is that the items in peer_run have changed, and there are more options in the new style. Does it work with just generate_token, or are all of them needed?

Otherwise, you do need changelogs for all of the issues listed in your fixes; I see only three are currently covered. As for the request about added functionality: yes, a feature request issue should be added for each big change so that you can get the changelog ID. Hopefully nothing serious that might need a full SEP is required.

whytewolf · Oct 05 '22 16:10

I am noticing that the config file changes in this PR; things are moved around etc. But you said earlier that it should still work with old-style items, with things like url being in the old style, whereas in the new one url belongs under server:. If it will work with the old style, I'm good with that.

The configuration in use by the components is parsed (from the opts dict) here, which makes sure the old style is correctly translated internally.

One big change I am seeing is that the items in peer_run have changed, and there are more options in the new style. Does it work with just generate_token, or are all of them needed?

The required vault configuration values have not changed. You are making a good point about needing updated peer_run configuration.

Updated minions will no longer attempt to use vault.generate_token, but vault.generate_new_token and vault.get_config instead. vault.generate_token is the old token-issuing endpoint and returns data in the same format as before, but will log a deprecation warning in the master log; this way, minions running older versions are still supported. I will need to add a fallback to the old endpoint for updated minions to account for unchanged peer_run configuration.

Edit: This has been added. With this commit, it is still advisable to update the peer_run configuration since updated minions default to vault.get_config and only fall back in case they receive no return value. The separate endpoints for config and auth details are intended to reduce unnecessary resource usage.

When configured for AppRole issuance, vault.generate_secret_id and vault.get_config are necessary, but this is a conscious switch.
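
Put together as a sketch of the resulting peer_run section on the master (the endpoint names come from this thread; the .* minion matcher is just an example):

peer_run:
  .*:
    - vault.get_config
    - vault.generate_new_token    # token mode for updated minions
    # - vault.generate_secret_id  # instead, when AppRole issuance is configured
    - vault.generate_token        # deprecated endpoint, kept for pre-upgrade minions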

Otherwise, you do need changelogs for all of the issues listed in your fixes; I see only three are currently covered. As for the request about added functionality: yes, a feature request issue should be added for each big change so that you can get the changelog ID. Hopefully nothing serious that might need a full SEP is required.

Will do. I hope that as well. :) I read about the SEP procedure, but remember concluding it would probably not be necessary.

Thanks for your feedback!

lkubb · Oct 05 '22 17:10

Hi! I'm your friendly PR bot!

You might be wondering what I'm doing commenting here on your PR.

Yes, as a matter of fact, I am...

I'm just here to help us improve the documentation. I can't respond to questions or anything, but what I can do, I do well!

Okay... so what do you do?

I detect modules that are missing docstrings or "CLI Example" on existing docstrings! When I was created we had a lot of these. The documentation for these modules needs some love and attention to make Salt better for our users.

So what does that have to do with my PR?

I noticed that in this PR there are some files changed that have some of these issues. So I'm leaving this comment to let you know your options.

Okay, what are they?

Well, my favorite, is that since you were making changes here I'm hoping that you would be the most familiar with this module and be able to add some other examples or fix any of the reported issues.

If I can, then what?

Well, you can either add them to this PR or add them to another PR. Either way is fine!

Well... what if I can't, or don't want to?

That's also fine! We appreciate all contributions to the Salt Project. If you can't add those other examples, either because you're too busy, or unfamiliar, or you just aren't interested, we still appreciate the contributions that you've made already.

Whatever approach you decide to take, just drop a comment here letting us know!

Detected Issues
Check Known Missing Docstrings...........................................Failed
- hook id: invoke
- exit code: 1

/home/runner/.cache/pre-commit/repoyety413h/py_env-python3/lib/python3.9/site-packages/_distutils_hack/init.py:33: UserWarning: Setuptools is replacing distutils.
  warnings.warn("Setuptools is replacing distutils.")
The function 'read_secret' on 'salt/modules/vault.py' does not have a 'CLI Example:' in its docstring
Found 1 errors


Thanks again!

github-actions[bot] · Oct 06 '22 16:10

Just waiting for tests to pass, but so far it's looking good. Also, thank you so much; this looks like an awesome addition to Salt. I was wondering if you could write up a general piece of information about this change, as I would like to feature it as one of the big changes coming in 3006.

whytewolf · Oct 20 '22 16:10

Happy to see this going forward and appreciated. I have been using it for the last couple of weeks and am [shamelessly] quite satisfied with the results of all that work so far. There are a couple of improvements in the back of my head that I will try to code up and get merged after this huge PR as well, one in particular regarding the vault sdb module. I've also written execution/state modules that interface with the Vault PKI backend, which would be a nice addition to the existing x509 modules imho. They still need polishing though.

I would be happy to write something up to help people grok what this change is about. Can you @whytewolf link me to some example so I have a reference of the style and scope?

Regarding the failing tests, I'm unsure if they are related or if this PR just triggers all the flaky tests / uses too much memory by modifying tests/pytests/conftest.py (which I tend to believe, since some of the test runs seem to get killed and in others, there are daemon startup issues). There seems to be no distinct pattern in which tests fail. Maybe someone with more experience can judge what they are about.

lkubb · Oct 21 '22 09:10

Happy to see this going forward and appreciated. I have been using it for the last couple of weeks and am [shamelessly] quite satisfied with the results of all that work so far. There are a couple of improvements in the back of my head that I will try to code up and get merged after this huge PR as well, one in particular regarding the vault sdb module. I've also written execution/state modules that interface with the Vault PKI backend, which would be a nice addition to the existing x509 modules imho. They still need polishing though.

I would be happy to write something up to help people grok what this change is about. Can you @whytewolf link me to some example so I have a reference of the style and scope?

Regarding the failing tests, I'm unsure if they are related or if this PR just triggers all the flaky tests / uses too much memory by modifying tests/pytests/conftest.py (which I tend to believe, since some of the test runs seem to get killed and in others, there are daemon startup issues). There seems to be no distinct pattern in which tests fail. Maybe someone with more experience can judge what they are about.

For an example of what I am talking about for a quick bit about the changes, I would point to the release notes, https://docs.saltproject.io/en/latest/topics/releases/3005.html, specifically the bit at the top that goes into the major changes.

As for the tests, it does look like flaky tests, most likely triggered by the conftest change only because it means almost all of the tests will be run now. There are changes coming while we work on those.

whytewolf · Oct 21 '22 16:10

So, we have been getting these Windows test failures on this, and I was thinking they were flaky tests to begin with, but we keep running into them. I'm wondering if your changes to the master configuration are causing the issues. All of the failing tests deal with minion/master or minion/multimaster communication. @lkubb, can you look into these failures more closely?

whytewolf · Dec 06 '22 21:12

@whytewolf From what I can tell after reading through an immense amount of logs, it seems the errors really are caused by running out of memory. I'm not sure about the exact cause though.

Possible reasons why it might be as pronounced as it is in this PR:

  • vault_container_version fixture is session-scoped, so 3 containers keep running in the background (I will try to change this to module scope) [this should not be a problem on Windows since the container tests are skipped]
  • many of the new Vault tests spin up temporary masters/minions, which might linger (? see second part, but unlikely, since ~~most~~ all of them only start with the container)

I will a) make the vault container fixture module-scoped, b) remove it from tests/pytests/conftest.py and import it where it is needed, and c) revert all other changes to tests/pytests/conftest.py (it will contain the old configuration structure), and see if that helps by reducing the number of tests that are run.
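
A minimal sketch of the scope change from a); everything except the fixture name and the session-to-module switch is hypothetical:

import contextlib

import pytest


@contextlib.contextmanager
def _run_vault_container(version):
    # Hypothetical stand-in for the real container setup/teardown.
    yield version


@pytest.fixture(scope="module")  # previously scope="session"
def vault_container_version(request):
    # With module scope, the container is torn down once a module's tests
    # finish instead of lingering for the rest of the session.
    with _run_vault_container(getattr(request, "param", "latest")) as version:
        yield version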

Windows multimaster tests:

The mm-failover-master-1 (and 2) configuration does not contain any of my changes, so I would rule that out:

22:56:03,661 [saltfactories.bases                                                             :617 ][DEBUG   ][MainProcess(4028)] Writing to configuration file C:\Windows\Temp\stsuite\mm-failover-master-1\conf\master. Configuration:
{'api_logfile': 'logs/api.log',
 'api_pidfile': 'run/api.pid',
 'cachedir': 'cache',
 'enable_legacy_startup_events': False,
 'engines': ['pytest'],
 'file_buffer_size': 8192,
 'file_roots': {'base': ['C:\\Windows\\Temp\\stsuite\\mm-failover-master-1\\state-tree\\base'],
                'prod': ['C:\\Windows\\Temp\\stsuite\\mm-failover-master-1\\state-tree\\prod']},
 'fileserver_backend': ['roots'],
 'fileserver_list_cache_time': 0,
 'id': 'mm-failover-master-1',
 'interface': '127.0.0.1',
 'key_logfile': 'logs/key.log',
 'log_file': 'logs/master.log',
 'log_fmt_console': '%(asctime)s,%(msecs)03.0f '
                    '[%(name)-17s:%(lineno)-4d][%(levelname)-8s][%(processName)18s(%(process)d)] '
                    '%(message)s',
 'log_fmt_logfile': '[%(asctime)s,%(msecs)03.0f][%(name)-17s:%(lineno)-4d][%(levelname)-8s][%(processName)18s(%(process)d)] '
                    '%(message)s',
 'log_level_logfile': 'debug',
 'master_sign_pubkey': True,
 'max_open_files': 10240,
 'open_mode': True,
 'order_masters': False,
 'peer': {'.*': ['test.*']},
 'pidfile': 'run/master.pid',
 'pillar_opts': False,
 'pillar_roots': {'base': ['C:\\Windows\\Temp\\stsuite\\mm-failover-master-1\\pillar-tree\\base'],
                  'prod': ['C:\\Windows\\Temp\\stsuite\\mm-failover-master-1\\pillar-tree\\prod']},
 'pki_dir': 'pki',
 'publish_port': 63006,
 'pytest-master': {'log': {'host': '127.0.0.1',
                           'level': 'debug',
                           'port': 49793,
                           'prefix': "SaltMaster(id='mm-failover-master-1')"},
                   'master-id': None,
                   'returner_address': 'tcp://127.0.0.1:49792'},
 'ret_port': 63007,
 'root_dir': 'C:\\Windows\\Temp\\stsuite\\mm-failover-master-1',
 'sock_dir': 'run/master',
 'tcp_master_pub_port': 63008,
 'tcp_master_publish_pull': 63010,
 'tcp_master_pull_port': 63009,
 'tcp_master_workers': 63011,
 'token_dir': 'tokens',
 'token_file': 'C:\\Windows\\Temp\\stsuite\\mm-failover-master-1\\ksfjhdgiuebfgnkefvsikhfjdgvkjahcsidk',
 'transport': 'zeromq',
 'user': 'EC2AMAZ-0TPMPCB\\Administrator'}

A diff with the output of another PR with passing tests:

@@ -28,21 +28,21 @@
 'pillar_roots': {'base': ['C:\\Windows\\Temp\\stsuite\\mm-failover-master-1\\pillar-tree\\base'],
                  'prod': ['C:\\Windows\\Temp\\stsuite\\mm-failover-master-1\\pillar-tree\\prod']},
 'pki_dir': 'pki',
- 'publish_port': 63006,
+ 'publish_port': 62349,
 'pytest-master': {'log': {'host': '127.0.0.1',
                           'level': 'debug',
-                           'port': 49793,
+                           'port': 49771,
                           'prefix': "SaltMaster(id='mm-failover-master-1')"},
                   'master-id': None,
-                   'returner_address': 'tcp://127.0.0.1:49792'},
- 'ret_port': 63007,
+                   'returner_address': 'tcp://127.0.0.1:49770'},
+ 'ret_port': 62350,
 'root_dir': 'C:\\Windows\\Temp\\stsuite\\mm-failover-master-1',
 'sock_dir': 'run/master',
- 'tcp_master_pub_port': 63008,
- 'tcp_master_publish_pull': 63010,
- 'tcp_master_pull_port': 63009,
- 'tcp_master_workers': 63011,
+ 'tcp_master_pub_port': 62351,
+ 'tcp_master_publish_pull': 62353,
+ 'tcp_master_pull_port': 62352,
+ 'tcp_master_workers': 62354,
 'token_dir': 'tokens',
 'token_file': 'C:\\Windows\\Temp\\stsuite\\mm-failover-master-1\\ksfjhdgiuebfgnkefvsikhfjdgvkjahcsidk',
 'transport': 'zeromq',
- 'user': 'EC2AMAZ-0TPMPCB\\Administrator'}
+ 'user': 'EC2AMAZ-NCQ5AGA\\Administrator'}

Same goes for the minion config:

22:56:48,473 [saltfactories.bases                                                             :617 ][DEBUG   ][MainProcess(4028)] Writing to configuration file C:\Windows\Temp\stsuite\mm-failover-minion-1\conf\minion. Configuration:
{'acceptance_wait_time': 0.5,
'acceptance_wait_time_max': 5,
'cachedir': 'cache',
'enable_legacy_startup_events': False,
'engines': ['pytest'],
'file_roots': {'base': ['C:\\Windows\\Temp\\stsuite\\mm-failover-minion-1\\state-tree\\base'],
               'prod': ['C:\\Windows\\Temp\\stsuite\\mm-failover-minion-1\\state-tree\\prod']},
'id': 'mm-failover-minion-1',
'interface': '127.0.0.1',
'log_file': 'logs/minion.log',
'log_fmt_console': '%(asctime)s,%(msecs)03.0f '
                   '[%(name)-17s:%(lineno)-4d][%(levelname)-8s][%(processName)18s(%(process)d)] '
                   '%(message)s',
'log_fmt_logfile': '[%(asctime)s,%(msecs)03.0f][%(name)-17s:%(lineno)-4d][%(levelname)-8s][%(processName)18s(%(process)d)] '
                   '%(message)s',
'log_level_logfile': 'debug',
'loop_interval': 0.05,
'master': ['127.0.0.1:63007', '127.0.0.2:63007'],
'master_alive_interval': 10,
'master_port': 63007,
'master_tries': -1,
'master_type': 'failover',
'pidfile': 'run/minion.pid',
'pillar_roots': {'base': ['C:\\Windows\\Temp\\stsuite\\mm-failover-minion-1\\pillar-tree\\base'],
                 'prod': ['C:\\Windows\\Temp\\stsuite\\mm-failover-minion-1\\pillar-tree\\prod']},
'pki_dir': 'pki',
'publish_port': 63006,
'pytest-minion': {'log': {'host': '127.0.0.1',
                          'level': 'debug',
                          'port': 49793,
                          'prefix': "SaltMinion(id='mm-failover-minion-1')"},
                  'master-id': 'mm-failover-master-1',
                  'returner_address': 'tcp://127.0.0.1:49792'},
'retry_dns': 1,
'root_dir': 'C:\\Windows\\Temp\\stsuite\\mm-failover-minion-1',
'sock_dir': 'run/minion',
'tcp_pub_port': 63366,
'tcp_pull_port': 63367,
'transport': 'zeromq',
'user': 'EC2AMAZ-0TPMPCB\\Administrator',
'verify_master_pubkey_sign': True}

Diff:

+++ {'acceptance_wait_time': 0.5,
@@ -15,27 +15,27 @@
                     '%(message)s',
  'log_level_logfile': 'debug',
  'loop_interval': 0.05,
- 'master': ['127.0.0.1:63007', '127.0.0.2:63007'],
+ 'master': ['127.0.0.1:62350', '127.0.0.2:62350'],
  'master_alive_interval': 10,
- 'master_port': 63007,
+ 'master_port': 62350,
  'master_tries': -1,
  'master_type': 'failover',
  'pidfile': 'run/minion.pid',
  'pillar_roots': {'base': ['C:\\Windows\\Temp\\stsuite\\mm-failover-minion-1\\pillar-tree\\base'],
                   'prod': ['C:\\Windows\\Temp\\stsuite\\mm-failover-minion-1\\pillar-tree\\prod']},
  'pki_dir': 'pki',
- 'publish_port': 63006,
+ 'publish_port': 62349,
  'pytest-minion': {'log': {'host': '127.0.0.1',
                            'level': 'debug',
-                           'port': 49793,
+                           'port': 49771,
                            'prefix': "SaltMinion(id='mm-failover-minion-1')"},
                    'master-id': 'mm-failover-master-1',
-                   'returner_address': 'tcp://127.0.0.1:49792'},
+                   'returner_address': 'tcp://127.0.0.1:49770'},
  'retry_dns': 1,
  'root_dir': 'C:\\Windows\\Temp\\stsuite\\mm-failover-minion-1',
  'sock_dir': 'run/minion',
- 'tcp_pub_port': 63366,
- 'tcp_pull_port': 63367,
+ 'tcp_pub_port': 62792,
+ 'tcp_pull_port': 62793,
  'transport': 'zeromq',
- 'user': 'EC2AMAZ-0TPMPCB\\Administrator',
+ 'user': 'EC2AMAZ-NCQ5AGA\\Administrator',
  'verify_master_pubkey_sign': True}

Both masters start up successfully:

22:56:20,411 [pytestshellutils.shell                                                          :1214][DEBUG   ][MainProcess(4028)] All start check callbacks executed for SaltMaster(id='mm-failover-master-1', config_file='C:\\Windows\\Temp\\stsuite\\mm-failover-master-1\\conf\\master', config_dir='C:\\Windows\\Temp\\stsuite\\mm-failover-master-1\\conf', python_executable='C:\\Windows\\Temp\\nox\\test-parametrized-3-crypto-none-transport-zeromq-coverage-true\\Scripts\\python.EXE', cwd=WindowsPath('C:/Users/Administrator/AppData/Local/Temp/kitchen/testing'), slow_stop=True, timeout=None, script_name='C:\\Windows\\Temp\\stsuite\\scripts\\cli_salt_master.py', base_script_args=[], check_ports=[], extra_cli_arguments_after_first_start_failure=['--log-level=debug'], display_name=None)
22:56:20,442 [pytestshellutils.shell                                                          :804 ][INFO    ][MainProcess(4028)] The SaltMaster(id='mm-failover-master-1', config_file='C:\\Windows\\Temp\\stsuite\\mm-failover-master-1\\conf\\master', config_dir='C:\\Windows\\Temp\\stsuite\\mm-failover-master-1\\conf', python_executable='C:\\Windows\\Temp\\nox\\test-parametrized-3-crypto-none-transport-zeromq-coverage-true\\Scripts\\python.EXE', cwd=WindowsPath('C:/Users/Administrator/AppData/Local/Temp/kitchen/testing'), slow_stop=True, timeout=None, script_name='C:\\Windows\\Temp\\stsuite\\scripts\\cli_salt_master.py', base_script_args=[], check_ports=[], extra_cli_arguments_after_first_start_failure=['--log-level=debug'], display_name=None) factory is running after 1 attempts. Took 16.75 seconds
[...]
22:56:48,458 [pytestshellutils.shell                                                          :1214][DEBUG   ][MainProcess(4028)] All start check callbacks executed for SaltMaster(id='mm-failover-master-2', config_file='C:\\Windows\\Temp\\stsuite\\mm-failover-master-2\\conf\\master', config_dir='C:\\Windows\\Temp\\stsuite\\mm-failover-master-2\\conf', python_executable='C:\\Windows\\Temp\\nox\\test-parametrized-3-crypto-none-transport-zeromq-coverage-true\\Scripts\\python.EXE', cwd=WindowsPath('C:/Users/Administrator/AppData/Local/Temp/kitchen/testing'), slow_stop=True, timeout=None, script_name='C:\\Windows\\Temp\\stsuite\\scripts\\cli_salt_master.py', base_script_args=[], check_ports=[], extra_cli_arguments_after_first_start_failure=['--log-level=debug'], display_name=None)
22:56:48,458 [pytestshellutils.shell                                                          :804 ][INFO    ][MainProcess(4028)] The SaltMaster(id='mm-failover-master-2', config_file='C:\\Windows\\Temp\\stsuite\\mm-failover-master-2\\conf\\master', config_dir='C:\\Windows\\Temp\\stsuite\\mm-failover-master-2\\conf', python_executable='C:\\Windows\\Temp\\nox\\test-parametrized-3-crypto-none-transport-zeromq-coverage-true\\Scripts\\python.EXE', cwd=WindowsPath('C:/Users/Administrator/AppData/Local/Temp/kitchen/testing'), slow_stop=True, timeout=None, script_name='C:\\Windows\\Temp\\stsuite\\scripts\\cli_salt_master.py', base_script_args=[], check_ports=[], extra_cli_arguments_after_first_start_failure=['--log-level=debug'], display_name=None) factory is running after 1 attempts. Took 27.92 seconds

The first minion starts up successfully:

22:57:13,550 [salt.minion                                                                     :3094][INFO    ][MainProcess(9308)] [SaltMinion(id='mm-failover-minion-1')] Minion is ready to receive requests!
[...]
22:57:15,113 [pytestshellutils.shell                                                          :804 ][INFO    ][MainProcess(4028)] The SaltMinion(id='mm-failover-minion-1', config_file='C:\\Windows\\Temp\\stsuite\\mm-failover-minion-1\\conf\\minion', config_dir='C:\\Windows\\Temp\\stsuite\\mm-failover-minion-1\\conf', python_executable='C:\\Windows\\Temp\\nox\\test-parametrized-3-crypto-none-transport-zeromq-coverage-true\\Scripts\\python.EXE', cwd=WindowsPath('C:/Users/Administrator/AppData/Local/Temp/kitchen/testing'), slow_stop=True, timeout=None, script_name='C:\\Windows\\Temp\\stsuite\\scripts\\cli_salt_minion.py', base_script_args=[], check_ports=[], extra_cli_arguments_after_first_start_failure=['--log-level=debug'], display_name=None) factory is running after 1 attempts. Took 26.59 seconds

The second one crashes immediately:

22:57:15,159 [pytestshellutils.shell                                                          :764 ][INFO    ][MainProcess(4028)] Starting SaltMinion(id='mm-failover-minion-2', config_file='C:\\Windows\\Temp\\stsuite\\mm-failover-minion-2\\conf\\minion', config_dir='C:\\Windows\\Temp\\stsuite\\mm-failover-minion-2\\conf', python_executable='C:\\Windows\\Temp\\nox\\test-parametrized-3-crypto-none-transport-zeromq-coverage-true\\Scripts\\python.EXE', cwd=WindowsPath('C:/Users/Administrator/AppData/Local/Temp/kitchen/testing'), slow_stop=True, timeout=None, script_name='C:\\Windows\\Temp\\stsuite\\scripts\\cli_salt_minion.py', base_script_args=[], check_ports=[], extra_cli_arguments_after_first_start_failure=['--log-level=debug'], display_name=None). Attempt: 1 of 3
22:57:15,159 [pytestshellutils.customtypes                                                    :85  ][DEBUG   ][MainProcess(4028)] Running SaltDaemon._set_started_at()
22:57:15,175 [pytestshellutils.shell                                                          :322 ][INFO    ][MainProcess(4028)] SaltMinion(id='mm-failover-minion-2', config_file='C:\\Windows\\Temp\\stsuite\\mm-failover-minion-2\\conf\\minion', config_dir='C:\\Windows\\Temp\\stsuite\\mm-failover-minion-2\\conf', python_executable='C:\\Windows\\Temp\\nox\\test-parametrized-3-crypto-none-transport-zeromq-coverage-true\\Scripts\\python.EXE', cwd=WindowsPath('C:/Users/Administrator/AppData/Local/Temp/kitchen/testing'), slow_stop=True, timeout=None, script_name='C:\\Windows\\Temp\\stsuite\\scripts\\cli_salt_minion.py', base_script_args=[], check_ports=[], extra_cli_arguments_after_first_start_failure=['--log-level=debug'], display_name=None) is running ['C:\\Windows\\Temp\\nox\\test-parametrized-3-crypto-none-transport-zeromq-coverage-true\\Scripts\\python.EXE', 'C:\\Windows\\Temp\\stsuite\\scripts\\cli_salt_minion.py', '--config-dir=C:\\Windows\\Temp\\stsuite\\mm-failover-minion-2\\conf', '--log-level=critical'] in CWD: C:\Users\Administrator\AppData\Local\Temp\kitchen\testing ...
22:57:15,238 [pytestshellutils.shell                                                          :214 ][INFO    ][MainProcess(4028)] Stopping SaltMinion(id='mm-failover-minion-2', config_file='C:\\Windows\\Temp\\stsuite\\mm-failover-minion-2\\conf\\minion', config_dir='C:\\Windows\\Temp\\stsuite\\mm-failover-minion-2\\conf', python_executable='C:\\Windows\\Temp\\nox\\test-parametrized-3-crypto-none-transport-zeromq-coverage-true\\Scripts\\python.EXE', cwd=WindowsPath('C:/Users/Administrator/AppData/Local/Temp/kitchen/testing'), slow_stop=True, timeout=None, script_name='C:\\Windows\\Temp\\stsuite\\scripts\\cli_salt_minion.py', base_script_args=[], check_ports=[], extra_cli_arguments_after_first_start_failure=['--log-level=debug'], display_name=None)
22:57:15,238 [pytestshellutils.customtypes                                                    :85  ][DEBUG   ][MainProcess(4028)] Running Daemon._terminate_processes_matching_listen_ports()
22:57:15,238 [pytestshellutils.customtypes                                                    :85  ][DEBUG   ][MainProcess(4028)] Running Daemon._remove_factory_from_stats_processes()

Presumably because of a memory error, since all the other daemons now report one:

 Command Line: ['C:\\Windows\\Temp\\nox\\test-parametrized-3-crypto-none-transport-zeromq-coverage-true\\Scripts\\python.EXE', 'C:\\Windows\\Temp\\stsuite\\scripts\\cli_salt_minion.py', '--config-dir=C:\\Windows\\Temp\\stsuite\\mm-failover-minion-1\\conf', '--log-level=critical']
 Returncode: 1
 Process Output:
   >>>>> STDERR >>>>>
22:57:04,691 [salt.minion                                                                   :638 ][CRITICAL][       MainProcess(9308)] 'master_type' set to 'failover' but 'retry_dns' is not 0. Setting 'retry_dns' to 0 to failover to the next master on DNS errors.
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "c:\python38\lib\multiprocessing\spawn.py", line 116, in spawn_main
    exitcode = _main(fd, parent_sentinel)
  File "c:\python38\lib\multiprocessing\spawn.py", line 126, in _main
    self = reduction.pickle.load(from_parent)
MemoryError
[ERROR   ] An un-handled exception was caught by Salt's global exception handler:
MemoryError: 
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "c:\python38\lib\multiprocessing\spawn.py", line 116, in spawn_main
    exitcode = _main(fd, parent_sentinel)
  File "c:\python38\lib\multiprocessing\spawn.py", line 126, in _main
    self = reduction.pickle.load(from_parent)
MemoryError

   <<<<< STDERR <<<<<

For the next test, both masters again start successfully, but the first minion crashes during authentication:

22:58:30,856 [salt.loaded.ext.engines.pytest_engine                                           :160 ][DEBUG   ][Engine(salt.loaded.ext.engines.pytest_engine)(14996)] [SaltMinion(id='mm-minion-1')] <PyTestEventForwardEngine role='minion' id='mm-minion-1', returner_address='tcp://127.0.0.1:49792' running=True> Received Event; TAG: '__master_connected' DATA: {'master': '127.0.0.1', '_stamp': '2022-12-06T22:58:30.841103'}
22:58:30,856 [saltfactories.plugins.event_listener                                            :202 ][INFO    ][MainProcess(4028)] EventListener(timeout=120, address='tcp://127.0.0.1:49792') received event: Event(daemon_id='mm-minion-1', tag='__master_connected', stamp=datetime.datetime(2022, 12, 6, 22, 58, 30, 841103), data={'master': '127.0.0.1'}, full_data={'master': '127.0.0.1', '_stamp': '2022-12-06T22:58:30.841103'}, expire_seconds=120, _expire_at=datetime.datetime(2022, 12, 6, 23, 0, 30, 841103))
22:58:30,856 [salt.loaded.ext.engines.pytest_engine                                           :167 ][INFO    ][Engine(salt.loaded.ext.engines.pytest_engine)(14996)] [SaltMinion(id='mm-minion-1')] <PyTestEventForwardEngine role='minion' id='mm-minion-1', returner_address='tcp://127.0.0.1:49792' running=True> forwarded event: ('mm-minion-1', '__master_connected', {'master': '127.0.0.1', '_stamp': '2022-12-06T22:58:30.841103'})
22:58:30,872 [saltfactories.plugins.event_listener                                            :217 ][DEBUG   ][MainProcess(4028)] EventListener(timeout=120, address='tcp://127.0.0.1:49792') store size after event received: 15
22:58:31,466 [saltfactories.plugins.event_listener                                            :329 ][DEBUG   ][MainProcess(4028)] EventListener(timeout=120, address='tcp://127.0.0.1:49792') is checking for event patterns happening after 2022-12-06T22:58:03.544214: {('mm-master-1', 'salt/minion/mm-minion-1/start')}
22:58:31,466 [saltfactories.plugins.event_listener                                            :358 ][DEBUG   ][MainProcess(4028)] EventListener(timeout=120, address='tcp://127.0.0.1:49792') did not find any matching event patterns happening after 2022-12-06T22:58:03.544214
22:58:31,700 [salt.transport.zeromq                                                           :675 ][DEBUG   ][PubServerChannel._publish_daemon(7040)] [SaltMaster(id='mm-master-1')] ZeroMQ event: {'event': 512, 'value': 1704, 'endpoint': b'tcp://127.0.0.1:63776', 'description': 'EVENT_DISCONNECTED'}
22:58:32,981 [pytestshellutils.utils.processes                                                :255 ][INFO    ][MainProcess(4028)] Terminating process list:
['<could not be retrived; dead process: psutil.Process(pid=8840, '
 "name='python.exe', status='terminated', started='22:58:03')>"]
22:58:32,981 [pytestshellutils.shell                                                          :298 ][INFO    ][MainProcess(4028)] SaltMinion ProcessResult
 Command Line: ['C:\\Windows\\Temp\\nox\\test-parametrized-3-crypto-none-transport-zeromq-coverage-true\\Scripts\\python.EXE', 'C:\\Windows\\Temp\\stsuite\\scripts\\cli_salt_minion.py', '--config-dir=C:\\Windows\\Temp\\stsuite\\mm-minion-1\\conf', '--log-level=critical']
 Returncode: 1073741845
 Process Output:
   >>>>> STDERR >>>>>
Assertion failed: error not defined [8] (C:\projects\libzmq\src\ip.cpp:488)
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "c:\python38\lib\multiprocessing\spawn.py", line 116, in spawn_main
    exitcode = _main(fd, parent_sentinel)
  File "c:\python38\lib\multiprocessing\spawn.py", line 126, in _main
    self = reduction.pickle.load(from_parent)
  File "C:\Users\Administrator\AppData\Local\Temp\kitchen\testing\salt\engines\__init__.py", line 8, in <module>
    import salt.loader
  File "C:\Users\Administrator\AppData\Local\Temp\kitchen\testing\salt\loader\__init__.py", line 23, in <module>
    import salt.utils.event
  File "C:\Users\Administrator\AppData\Local\Temp\kitchen\testing\salt\utils\event.py", line 63, in <module>
    import salt.channel.client
  File "C:\Users\Administrator\AppData\Local\Temp\kitchen\testing\salt\channel\client.py", line 13, in <module>
    import salt.crypt
  File "C:\Users\Administrator\AppData\Local\Temp\kitchen\testing\salt\crypt.py", line 55, in <module>
    from Cryptodome.Cipher import AES, PKCS1_OAEP
  File "C:\Windows\Temp\nox\test-parametrized-3-crypto-none-transport-zeromq-coverage-true\lib\site-packages\Cryptodome\Cipher\__init__.py", line 29, in <module>
    from Cryptodome.Cipher._mode_cfb import _create_cfb_cipher
  File "C:\Windows\Temp\nox\test-parametrized-3-crypto-none-transport-zeromq-coverage-true\lib\site-packages\Cryptodome\Cipher\_mode_cfb.py", line 37, in <module>
    raw_cfb_lib = load_pycryptodome_raw_lib("Cryptodome.Cipher._raw_cfb","""
  File "C:\Windows\Temp\nox\test-parametrized-3-crypto-none-transport-zeromq-coverage-true\lib\site-packages\Cryptodome\Util\_raw_api.py", line 297, in load_pycryptodome_raw_lib
    raise OSError("Cannot load native module '%s': %s" % (name, ", ".join(attempts)))
OSError: Cannot load native module 'Cryptodome.Cipher._raw_cfb': Trying '_raw_cfb.cp38-win_amd64.pyd': cannot load library 'C:\Windows\Temp\nox\test-parametrized-3-crypto-none-transport-zeromq-coverage-true\lib\site-packages\Cryptodome\Util\..\Cipher\_raw_cfb.cp38-win_amd64.pyd': error 0x7e.  Additionally, ctypes.util.find_library() did not manage to locate a library called 'C:\\Windows\\Temp\\nox\\test-parametrized-3-crypto-none-transport-zeromq-coverage-true\\lib\\site-packages\\Cryptodome\\Util\\..\\Cipher\\_raw_cfb.cp38-win_amd64.pyd', Trying '_raw_cfb.pyd': cannot load library 'C:\Windows\Temp\nox\test-parametrized-3-crypto-none-transport-zeromq-coverage-true\lib\site-packages\Cryptodome\Util\..\Cipher\_raw_cfb.pyd': error 0x5af

Unhandled Exception: OutOfMemoryException.

   <<<<< STDERR <<<<<

The second attempt for the first minion apparently works:

22:58:32,981 [pytestshellutils.shell                                                          :764 ][INFO    ][MainProcess(4028)] Starting SaltMinion(id='mm-minion-1', config_file='C:\\Windows\\Temp\\stsuite\\mm-minion-1\\conf\\minion', config_dir='C:\\Windows\\Temp\\stsuite\\mm-minion-1\\conf', python_executable='C:\\Windows\\Temp\\nox\\test-parametrized-3-crypto-none-transport-zeromq-coverage-true\\Scripts\\python.EXE', cwd=WindowsPath('C:/Users/Administrator/AppData/Local/Temp/kitchen/testing'), slow_stop=True, timeout=None, script_name='C:\\Windows\\Temp\\stsuite\\scripts\\cli_salt_minion.py', base_script_args=[], check_ports=[], extra_cli_arguments_after_first_start_failure=['--log-level=debug'], display_name="SaltMinion(id='mm-minion-1')"). Attempt: 2 of 3

But the pytest engine crashes, presumably again because of memory issues:

22:58:49,341 [salt.utils.process                                                              :549 ][INFO    ][MainProcess(15916)] [SaltMinion(id='mm-minion-1')] Process <class 'salt.engines.Engine'> (17052) died with exit status 3762504530, restarting...

The second minion is hopeless, of course, and crashes at different points with a MemoryError:

22:58:49,684 [pytestshellutils.shell                                                          :764 ][INFO    ][MainProcess(4028)] Starting SaltMinion(id='mm-minion-2', config_file='C:\\Windows\\Temp\\stsuite\\mm-minion-2\\conf\\minion', config_dir='C:\\Windows\\Temp\\stsuite\\mm-minion-2\\conf', python_executable='C:\\Windows\\Temp\\nox\\test-parametrized-3-crypto-none-transport-zeromq-coverage-true\\Scripts\\python.EXE', cwd=WindowsPath('C:/Users/Administrator/AppData/Local/Temp/kitchen/testing'), slow_stop=True, timeout=None, script_name='C:\\Windows\\Temp\\stsuite\\scripts\\cli_salt_minion.py', base_script_args=[], check_ports=[], extra_cli_arguments_after_first_start_failure=['--log-level=debug'], display_name=None). Attempt: 1 of 3
Traceback (most recent call last):
  File "C:\Windows\Temp\nox\test-parametrized-3-crypto-none-transport-zeromq-coverage-true\lib\site-packages\pytestshellutils\shell.py", line 1195, in run_start_checks
    ret = start_check(timeout_at)
  File "C:\Windows\Temp\nox\test-parametrized-3-crypto-none-transport-zeromq-coverage-true\lib\site-packages\pytestshellutils\customtypes.py", line 86, in __call__
    return self.func(*_args, **_kwargs)
  File "C:\Windows\Temp\nox\test-parametrized-3-crypto-none-transport-zeromq-coverage-true\lib\site-packages\saltfactories\bases.py", line 708, in _check_start_events
    raise FactoryNotStarted("{} is no longer running".format(self))
pytestshellutils.exceptions.FactoryNotStarted: SaltMinion(id='mm-minion-2', config_file='C:\\Windows\\Temp\\stsuite\\mm-minion-2\\conf\\minion', config_dir='C:\\Windows\\Temp\\stsuite\\mm-minion-2\\conf', python_executable='C:\\Windows\\Temp\\nox\\test-parametrized-3-crypto-none-transport-zeromq-coverage-true\\Scripts\\python.EXE', cwd=WindowsPath('C:/Users/Administrator/AppData/Local/Temp/kitchen/testing'), slow_stop=True, timeout=None, script_name='C:\\Windows\\Temp\\stsuite\\scripts\\cli_salt_minion.py', base_script_args=[], check_ports=[], extra_cli_arguments_after_first_start_failure=['--log-level=debug'], display_name=None) is no longer running
22:58:51,262 [pytestshellutils.shell                                                          :214 ][INFO    ][MainProcess(4028)] Stopping SaltMinion(id='mm-minion-2', config_file='C:\\Windows\\Temp\\stsuite\\mm-minion-2\\conf\\minion', config_dir='C:\\Windows\\Temp\\stsuite\\mm-minion-2\\conf', python_executable='C:\\Windows\\Temp\\nox\\test-parametrized-3-crypto-none-transport-zeromq-coverage-true\\Scripts\\python.EXE', cwd=WindowsPath('C:/Users/Administrator/AppData/Local/Temp/kitchen/testing'), slow_stop=True, timeout=None, script_name='C:\\Windows\\Temp\\stsuite\\scripts\\cli_salt_minion.py', base_script_args=[], check_ports=[], extra_cli_arguments_after_first_start_failure=['--log-level=debug'], display_name=None)
22:58:51,262 [pytestshellutils.utils.processes                                                :404 ][INFO    ][MainProcess(4028)] Terminating process list: [psutil.Process(pid=13580, status='terminated', started='22:58:49')]
22:58:51,262 [pytestshellutils.utils.processes                                                :320 ][INFO    ][MainProcess(4028)] Terminating process list. 1st step. kill: False, slow stop: True
22:58:51,262 [pytestshellutils.utils.processes                                                :255 ][INFO    ][MainProcess(4028)] Terminating process list:
['<could not be retrived; dead process: psutil.Process(pid=13580, '
 "status='terminated', started='22:58:49')>"]
22:58:51,262 [pytestshellutils.shell                                                          :298 ][INFO    ][MainProcess(4028)] SaltMinion ProcessResult
 Command Line: ['C:\\Windows\\Temp\\nox\\test-parametrized-3-crypto-none-transport-zeromq-coverage-true\\Scripts\\python.EXE', 'C:\\Windows\\Temp\\stsuite\\scripts\\cli_salt_minion.py', '--config-dir=C:\\Windows\\Temp\\stsuite\\mm-minion-2\\conf', '--log-level=critical']
 Returncode: 1
 Process Output:
   >>>>> STDERR >>>>>
Error in sitecustomize; set PYTHONVERBOSE for traceback:
MemoryError: 
Traceback (most recent call last):
  File "C:\Windows\Temp\stsuite\scripts\cli_salt_minion.py", line 39, in <module>
    from salt.scripts import salt_minion
  File "C:\Users\Administrator\AppData\Local\Temp\kitchen\testing\salt\__init__.py", line 146, in <module>
    import salt._logging  # isort:skip
  File "C:\Users\Administrator\AppData\Local\Temp\kitchen\testing\salt\_logging\__init__.py", line 12, in <module>
    from salt._logging.impl import (
MemoryError

   <<<<< STDERR <<<<<

Linux swarm tests: at least on Linux, this might be caused by some processes not exiting cleanly. For example, the current Alma build shows the multimaster scenario minions/masters running not only shortly after their tests have finished, but for the entire rest of the run, likely causing the failed swarm tests (which fail with a MemoryError as well):

I, [2022-12-07T00:19:07.059546 #74081]  INFO -- py3-alma-8-x86-64: 
I, [2022-12-07T00:19:23.968028 #74081]  INFO -- py3-alma-8-x86-64: tests/pytests/scenarios/multimaster/modules/test_test.py::test_get_opts [32mPASSED[0m
I, [2022-12-07T00:19:23.968158 #74081]  INFO -- py3-alma-8-x86-64: [1m----------------------------- Processes Statistics -----------------------------[0m
I, [2022-12-07T00:19:23.968801 #74081]  INFO -- py3-alma-8-x86-64:   ...........................  System  -  CPU:  56.90 %   MEM:  66.50 % (Virtual Memory)
I, [2022-12-07T00:19:23.973362 #74081]  INFO -- py3-alma-8-x86-64:   ...  SaltMinion(id='minion-mCLWRd')  -  CPU:   0.10 %   MEM:   3.55 % (RSS)   MEM SUM:   4.37 % (RSS)   CHILD PROCS: 1
I, [2022-12-07T00:19:23.978832 #74081]  INFO -- py3-alma-8-x86-64:   ...  SaltMaster(id='master-6eklIP')  -  CPU:   0.60 %   MEM:   1.06 % (RSS)   MEM SUM:  15.33 % (RSS)   CHILD PROCS: 12
I, [2022-12-07T00:19:23.983861 #74081]  INFO -- py3-alma-8-x86-64:   .....  SaltMaster(id='mm-master-1')  -  CPU:   0.00 %   MEM:   1.02 % (RSS)   MEM SUM:  12.10 % (RSS)   CHILD PROCS: 12
I, [2022-12-07T00:19:23.988604 #74081]  INFO -- py3-alma-8-x86-64:   .....  SaltMaster(id='mm-master-2')  -  CPU:   0.00 %   MEM:   1.02 % (RSS)   MEM SUM:  12.10 % (RSS)   CHILD PROCS: 12
I, [2022-12-07T00:19:23.992641 #74081]  INFO -- py3-alma-8-x86-64:   .....  SaltMinion(id='mm-minion-1')  -  CPU:   0.90 %   MEM:   2.89 % (RSS)   MEM SUM:   4.34 % (RSS)   CHILD PROCS: 2
I, [2022-12-07T00:19:23.996560 #74081]  INFO -- py3-alma-8-x86-64:   .....  SaltMinion(id='mm-minion-2')  -  CPU:   0.90 %   MEM:   0.97 % (RSS)   MEM SUM:   2.42 % (RSS)   CHILD PROCS: 2
I, [2022-12-07T00:19:24.005539 #74081]  INFO -- py3-alma-8-x86-64: 
I, [2022-12-07T00:19:53.656706 #74081]  INFO -- py3-alma-8-x86-64: tests/pytests/scenarios/setup/test_install.py::test_wheel[USE_STATIC_REQUIREMENTS=1] [32mPASSED[0m
I, [2022-12-07T00:19:53.656847 #74081]  INFO -- py3-alma-8-x86-64: [1m----------------------------- Processes Statistics -----------------------------[0m
I, [2022-12-07T00:19:53.657586 #74081]  INFO -- py3-alma-8-x86-64:   ...........................  System  -  CPU:  41.30 %   MEM:  67.00 % (Virtual Memory)
I, [2022-12-07T00:19:53.662930 #74081]  INFO -- py3-alma-8-x86-64:   ...  SaltMinion(id='minion-mCLWRd')  -  CPU:   0.10 %   MEM:   3.51 % (RSS)   MEM SUM:   4.31 % (RSS)   CHILD PROCS: 1
I, [2022-12-07T00:19:53.668069 #74081]  INFO -- py3-alma-8-x86-64:   ...  SaltMaster(id='master-6eklIP')  -  CPU:   0.00 %   MEM:   1.03 % (RSS)   MEM SUM:  15.06 % (RSS)   CHILD PROCS: 12
I, [2022-12-07T00:19:53.673265 #74081]  INFO -- py3-alma-8-x86-64:   .....  SaltMaster(id='mm-master-1')  -  CPU:   0.00 %   MEM:   0.98 % (RSS)   MEM SUM:  11.84 % (RSS)   CHILD PROCS: 12
I, [2022-12-07T00:19:53.678239 #74081]  INFO -- py3-alma-8-x86-64:   .....  SaltMaster(id='mm-master-2')  -  CPU:   0.00 %   MEM:   0.98 % (RSS)   MEM SUM:  11.82 % (RSS)   CHILD PROCS: 12
I, [2022-12-07T00:19:53.682123 #74081]  INFO -- py3-alma-8-x86-64:   .....  SaltMinion(id='mm-minion-1')  -  CPU:   0.20 %   MEM:   2.85 % (RSS)   MEM SUM:   4.28 % (RSS)   CHILD PROCS: 2
I, [2022-12-07T00:19:53.688446 #74081]  INFO -- py3-alma-8-x86-64:   .....  SaltMinion(id='mm-minion-2')  -  CPU:   0.20 %   MEM:   0.94 % (RSS)   MEM SUM:   2.35 % (RSS)   CHILD PROCS: 2
I, [2022-12-07T00:19:53.692702 #74081]  INFO -- py3-alma-8-x86-64: 
[...]
I, [2022-12-07T00:23:39.793080 #74081]  INFO -- py3-alma-8-x86-64: tests/pytests/scenarios/setup/test_man.py::test_man_pages [32mPASSED[0m
I, [2022-12-07T00:23:39.793187 #74081]  INFO -- py3-alma-8-x86-64: [1m----------------------------- Processes Statistics -----------------------------[0m
I, [2022-12-07T00:23:39.793748 #74081]  INFO -- py3-alma-8-x86-64:   ...........................  System  -  CPU:  53.20 %   MEM:  67.10 % (Virtual Memory)
I, [2022-12-07T00:23:39.798881 #74081]  INFO -- py3-alma-8-x86-64:   ...  SaltMinion(id='minion-mCLWRd')  -  CPU:   0.20 %   MEM:   3.30 % (RSS)   MEM SUM:   4.11 % (RSS)   CHILD PROCS: 1
I, [2022-12-07T00:23:39.804410 #74081]  INFO -- py3-alma-8-x86-64:   ...  SaltMaster(id='master-6eklIP')  -  CPU:   0.00 %   MEM:   0.97 % (RSS)   MEM SUM:  14.69 % (RSS)   CHILD PROCS: 12
I, [2022-12-07T00:23:39.809740 #74081]  INFO -- py3-alma-8-x86-64:   .....  SaltMaster(id='mm-master-1')  -  CPU:   0.00 %   MEM:   0.94 % (RSS)   MEM SUM:  11.68 % (RSS)   CHILD PROCS: 12
I, [2022-12-07T00:23:39.815219 #74081]  INFO -- py3-alma-8-x86-64:   .....  SaltMaster(id='mm-master-2')  -  CPU:   0.00 %   MEM:   0.94 % (RSS)   MEM SUM:  11.65 % (RSS)   CHILD PROCS: 12
I, [2022-12-07T00:23:39.819419 #74081]  INFO -- py3-alma-8-x86-64:   .....  SaltMinion(id='mm-minion-1')  -  CPU:   0.20 %   MEM:   2.64 % (RSS)   MEM SUM:   4.07 % (RSS)   CHILD PROCS: 2
I, [2022-12-07T00:23:39.823613 #74081]  INFO -- py3-alma-8-x86-64:   .....  SaltMinion(id='mm-minion-2')  -  CPU:   0.20 %   MEM:   0.92 % (RSS)   MEM SUM:   2.34 % (RSS)   CHILD PROCS: 2
I, [2022-12-07T00:23:40.180430 #74081]  INFO -- py3-alma-8-x86-64: 
I, [2022-12-07T00:28:56.846014 #74081]  INFO -- py3-alma-8-x86-64: tests/pytests/scenarios/swarm/test_minion_swarm.py::test_ping [31mFAILED[0m
I, [2022-12-07T00:28:56.846130 #74081]  INFO -- py3-alma-8-x86-64: [1m----------------------------- Processes Statistics -----------------------------[0m
I, [2022-12-07T00:28:56.846667 #74081]  INFO -- py3-alma-8-x86-64:   ....................................  System  -  CPU:  36.20 %   MEM:  95.60 % (Virtual Memory)
I, [2022-12-07T00:28:56.852858 #74081]  INFO -- py3-alma-8-x86-64:   ............  SaltMinion(id='minion-mCLWRd')  -  CPU:   0.10 %   MEM:   3.30 % (RSS)   MEM SUM:   4.11 % (RSS)   CHILD PROCS: 1
I, [2022-12-07T00:28:56.859439 #74081]  INFO -- py3-alma-8-x86-64:   ............  SaltMaster(id='master-6eklIP')  -  CPU:   0.00 %   MEM:   0.98 % (RSS)   MEM SUM:  14.72 % (RSS)   CHILD PROCS: 12
I, [2022-12-07T00:28:56.866052 #74081]  INFO -- py3-alma-8-x86-64:   ..............  SaltMaster(id='mm-master-1')  -  CPU:   0.00 %   MEM:   0.95 % (RSS)   MEM SUM:  11.98 % (RSS)   CHILD PROCS: 12
I, [2022-12-07T00:28:56.872130 #74081]  INFO -- py3-alma-8-x86-64:   ..............  SaltMaster(id='mm-master-2')  -  CPU:   0.00 %   MEM:   0.94 % (RSS)   MEM SUM:  11.68 % (RSS)   CHILD PROCS: 12
I, [2022-12-07T00:28:56.877203 #74081]  INFO -- py3-alma-8-x86-64:   ..............  SaltMinion(id='mm-minion-1')  -  CPU:   0.20 %   MEM:   2.64 % (RSS)   MEM SUM:   4.07 % (RSS)   CHILD PROCS: 2
I, [2022-12-07T00:28:56.882253 #74081]  INFO -- py3-alma-8-x86-64:   ..............  SaltMinion(id='mm-minion-2')  -  CPU:   0.20 %   MEM:   0.92 % (RSS)   MEM SUM:   2.34 % (RSS)   CHILD PROCS: 2
I, [2022-12-07T00:28:56.888130 #74081]  INFO -- py3-alma-8-x86-64:   ......  SaltMaster(id='swarm-master-3UHy6c')  -  CPU:   0.00 %   MEM:   1.02 % (RSS)   MEM SUM:  12.12 % (RSS)   CHILD PROCS: 12
I, [2022-12-07T00:28:56.893121 #74081]  INFO -- py3-alma-8-x86-64:   ....  SaltMinion(id='swarm-minion-0-uiBvwN')  -  CPU:   0.00 %   MEM:   0.94 % (RSS)   MEM SUM:   1.65 % (RSS)   CHILD PROCS: 1
I, [2022-12-07T00:28:56.898085 #74081]  INFO -- py3-alma-8-x86-64:   ....  SaltMinion(id='swarm-minion-1-i7Diyw')  -  CPU:   0.00 %   MEM:   0.94 % (RSS)   MEM SUM:   1.64 % (RSS)   CHILD PROCS: 1
I, [2022-12-07T00:28:56.903044 #74081]  INFO -- py3-alma-8-x86-64:   ....  SaltMinion(id='swarm-minion-2-zP13Mh')  -  CPU:   0.00 %   MEM:   0.94 % (RSS)   MEM SUM:   1.64 % (RSS)   CHILD PROCS: 1
I, [2022-12-07T00:28:56.908108 #74081]  INFO -- py3-alma-8-x86-64:   ....  SaltMinion(id='swarm-minion-3-MzTbn9')  -  CPU:   0.00 %   MEM:   0.94 % (RSS)   MEM SUM:   1.64 % (RSS)   CHILD PROCS: 1
I, [2022-12-07T00:28:56.913252 #74081]  INFO -- py3-alma-8-x86-64:   ....  SaltMinion(id='swarm-minion-4-xKwUw0')  -  CPU:   0.00 %   MEM:   0.94 % (RSS)   MEM SUM:   1.65 % (RSS)   CHILD PROCS: 1
I, [2022-12-07T00:28:56.918268 #74081]  INFO -- py3-alma-8-x86-64:   ....  SaltMinion(id='swarm-minion-5-dLLi7m')  -  CPU:   0.00 %   MEM:   0.94 % (RSS)   MEM SUM:   1.65 % (RSS)   CHILD PROCS: 1
I, [2022-12-07T00:28:56.923186 #74081]  INFO -- py3-alma-8-x86-64:   ....  SaltMinion(id='swarm-minion-6-v32udZ')  -  CPU:   0.00 %   MEM:   0.94 % (RSS)   MEM SUM:   1.64 % (RSS)   CHILD PROCS: 1
I, [2022-12-07T00:28:56.928133 #74081]  INFO -- py3-alma-8-x86-64:   ....  SaltMinion(id='swarm-minion-7-P4516w')  -  CPU:   0.00 %   MEM:   0.94 % (RSS)   MEM SUM:   1.65 % (RSS)   CHILD PROCS: 1
I, [2022-12-07T00:28:56.933500 #74081]  INFO -- py3-alma-8-x86-64:   ....  SaltMinion(id='swarm-minion-8-xy9Vc4')  -  CPU:   0.00 %   MEM:   0.94 % (RSS)   MEM SUM:   1.65 % (RSS)   CHILD PROCS: 1
I, [2022-12-07T00:28:56.938520 #74081]  INFO -- py3-alma-8-x86-64:   ....  SaltMinion(id='swarm-minion-9-snUit2')  -  CPU:   0.00 %   MEM:   0.94 % (RSS)   MEM SUM:   1.65 % (RSS)   CHILD PROCS: 1
I, [2022-12-07T00:28:56.944703 #74081]  INFO -- py3-alma-8-x86-64:   ...  SaltMinion(id='swarm-minion-10-lbky9G')  -  CPU:   0.00 %   MEM:   0.94 % (RSS)   MEM SUM:   1.64 % (RSS)   CHILD PROCS: 1
I, [2022-12-07T00:28:56.950174 #74081]  INFO -- py3-alma-8-x86-64:   ...  SaltMinion(id='swarm-minion-11-Qn5sMf')  -  CPU:   0.00 %   MEM:   0.94 % (RSS)   MEM SUM:   1.64 % (RSS)   CHILD PROCS: 1
I, [2022-12-07T00:28:56.955186 #74081]  INFO -- py3-alma-8-x86-64:   ...  SaltMinion(id='swarm-minion-12-UIAbRs')  -  CPU:   0.00 %   MEM:   0.94 % (RSS)   MEM SUM:   1.64 % (RSS)   CHILD PROCS: 1
I, [2022-12-07T00:28:56.960133 #74081]  INFO -- py3-alma-8-x86-64:   ...  SaltMinion(id='swarm-minion-13-QVE3Dc')  -  CPU:   0.00 %   MEM:   0.94 % (RSS)   MEM SUM:   1.65 % (RSS)   CHILD PROCS: 1
I, [2022-12-07T00:28:56.965198 #74081]  INFO -- py3-alma-8-x86-64:   ...  SaltMinion(id='swarm-minion-14-1X8F5D')  -  CPU:   0.00 %   MEM:   0.94 % (RSS)   MEM SUM:   1.64 % (RSS)   CHILD PROCS: 1
I, [2022-12-07T00:28:56.970152 #74081]  INFO -- py3-alma-8-x86-64:   ...  SaltMinion(id='swarm-minion-15-6HrBEw')  -  CPU:   0.00 %   MEM:   0.94 % (RSS)   MEM SUM:   1.64 % (RSS)   CHILD PROCS: 1
I, [2022-12-07T00:28:56.975127 #74081]  INFO -- py3-alma-8-x86-64:   ...  SaltMinion(id='swarm-minion-16-xGtJWY')  -  CPU:   0.00 %   MEM:   0.94 % (RSS)   MEM SUM:   1.64 % (RSS)   CHILD PROCS: 1
I, [2022-12-07T00:28:56.980165 #74081]  INFO -- py3-alma-8-x86-64:   ...  SaltMinion(id='swarm-minion-17-M5764n')  -  CPU:   0.00 %   MEM:   0.94 % (RSS)   MEM SUM:   1.64 % (RSS)   CHILD PROCS: 1
I, [2022-12-07T00:28:56.985131 #74081]  INFO -- py3-alma-8-x86-64:   ...  SaltMinion(id='swarm-minion-18-PlQSm5')  -  CPU:   0.00 %   MEM:   0.94 % (RSS)   MEM SUM:   1.64 % (RSS)   CHILD PROCS: 1
I, [2022-12-07T00:28:56.990425 #74081]  INFO -- py3-alma-8-x86-64:   ...  SaltMinion(id='swarm-minion-19-oF0DVu')  -  CPU:   0.00 %   MEM:   0.94 % (RSS)   MEM SUM:   1.64 % (RSS)   CHILD PROCS: 1
I, [2022-12-07T00:28:57.024278 #74081]  INFO -- py3-alma-8-x86-64: 
I, [2022-12-07T00:28:57.144565 #74081]  INFO -- py3-alma-8-x86-64: tests/pytests/scenarios/swarm/test_minion_swarm.py::test_ping_one [31mFAILED[0m
I, [2022-12-07T00:28:57.144732 #74081]  INFO -- py3-alma-8-x86-64: [1m----------------------------- Processes Statistics -----------------------------[0m
I, [2022-12-07T00:28:57.144996 #74081]  INFO -- py3-alma-8-x86-64:   ....................................  System  -  CPU:  54.20 %   MEM:  95.60 % (Virtual Memory)
I, [2022-12-07T00:28:57.151291 #74081]  INFO -- py3-alma-8-x86-64:   ............  SaltMinion(id='minion-mCLWRd')  -  CPU:   0.00 %   MEM:   3.30 % (RSS)   MEM SUM:   4.11 % (RSS)   CHILD PROCS: 1
I, [2022-12-07T00:28:57.157871 #74081]  INFO -- py3-alma-8-x86-64:   ............  SaltMaster(id='master-6eklIP')  -  CPU:   0.00 %   MEM:   0.98 % (RSS)   MEM SUM:  14.72 % (RSS)   CHILD PROCS: 12
I, [2022-12-07T00:28:57.164121 #74081]  INFO -- py3-alma-8-x86-64:   ..............  SaltMaster(id='mm-master-1')  -  CPU:   0.00 %   MEM:   0.95 % (RSS)   MEM SUM:  11.98 % (RSS)   CHILD PROCS: 12
I, [2022-12-07T00:28:57.170325 #74081]  INFO -- py3-alma-8-x86-64:   ..............  SaltMaster(id='mm-master-2')  -  CPU:   0.00 %   MEM:   0.94 % (RSS)   MEM SUM:  11.68 % (RSS)   CHILD PROCS: 12
I, [2022-12-07T00:28:57.175380 #74081]  INFO -- py3-alma-8-x86-64:   ..............  SaltMinion(id='mm-minion-1')  -  CPU:   0.00 %   MEM:   2.64 % (RSS)   MEM SUM:   4.07 % (RSS)   CHILD PROCS: 2
I, [2022-12-07T00:28:57.180479 #74081]  INFO -- py3-alma-8-x86-64:   ..............  SaltMinion(id='mm-minion-2')  -  CPU:   0.00 %   MEM:   0.92 % (RSS)   MEM SUM:   2.34 % (RSS)   CHILD PROCS: 2
I, [2022-12-07T00:28:57.186301 #74081]  INFO -- py3-alma-8-x86-64:   ......  SaltMaster(id='swarm-master-3UHy6c')  -  CPU:   0.00 %   MEM:   1.02 % (RSS)   MEM SUM:  12.12 % (RSS)   CHILD PROCS: 12
I, [2022-12-07T00:28:57.191353 #74081]  INFO -- py3-alma-8-x86-64:   ....  SaltMinion(id='swarm-minion-0-uiBvwN')  -  CPU:   0.00 %   MEM:   0.94 % (RSS)   MEM SUM:   1.65 % (RSS)   CHILD PROCS: 1
I, [2022-12-07T00:28:57.196267 #74081]  INFO -- py3-alma-8-x86-64:   ....  SaltMinion(id='swarm-minion-1-i7Diyw')  -  CPU:   3.40 %   MEM:   0.94 % (RSS)   MEM SUM:   1.64 % (RSS)   CHILD PROCS: 1
I, [2022-12-07T00:28:57.201239 #74081]  INFO -- py3-alma-8-x86-64:   ....  SaltMinion(id='swarm-minion-2-zP13Mh')  -  CPU:   3.40 %   MEM:   0.94 % (RSS)   MEM SUM:   1.64 % (RSS)   CHILD PROCS: 1
I, [2022-12-07T00:28:57.206344 #74081]  INFO -- py3-alma-8-x86-64:   ....  SaltMinion(id='swarm-minion-3-MzTbn9')  -  CPU:   0.00 %   MEM:   0.94 % (RSS)   MEM SUM:   1.64 % (RSS)   CHILD PROCS: 1
I, [2022-12-07T00:28:57.211305 #74081]  INFO -- py3-alma-8-x86-64:   ....  SaltMinion(id='swarm-minion-4-xKwUw0')  -  CPU:   0.00 %   MEM:   0.94 % (RSS)   MEM SUM:   1.65 % (RSS)   CHILD PROCS: 1
I, [2022-12-07T00:28:57.216190 #74081]  INFO -- py3-alma-8-x86-64:   ....  SaltMinion(id='swarm-minion-5-dLLi7m')  -  CPU:   0.00 %   MEM:   0.94 % (RSS)   MEM SUM:   1.65 % (RSS)   CHILD PROCS: 1
I, [2022-12-07T00:28:57.221117 #74081]  INFO -- py3-alma-8-x86-64:   ....  SaltMinion(id='swarm-minion-6-v32udZ')  -  CPU:   0.00 %   MEM:   0.94 % (RSS)   MEM SUM:   1.64 % (RSS)   CHILD PROCS: 1
I, [2022-12-07T00:28:57.226070 #74081]  INFO -- py3-alma-8-x86-64:   ....  SaltMinion(id='swarm-minion-7-P4516w')  -  CPU:   0.00 %   MEM:   0.94 % (RSS)   MEM SUM:   1.65 % (RSS)   CHILD PROCS: 1
I, [2022-12-07T00:28:57.231040 #74081]  INFO -- py3-alma-8-x86-64:   ....  SaltMinion(id='swarm-minion-8-xy9Vc4')  -  CPU:   0.00 %   MEM:   0.94 % (RSS)   MEM SUM:   1.65 % (RSS)   CHILD PROCS: 1
I, [2022-12-07T00:28:57.236862 #74081]  INFO -- py3-alma-8-x86-64:   ....  SaltMinion(id='swarm-minion-9-snUit2')  -  CPU:   0.00 %   MEM:   0.94 % (RSS)   MEM SUM:   1.65 % (RSS)   CHILD PROCS: 1
I, [2022-12-07T00:28:57.242443 #74081]  INFO -- py3-alma-8-x86-64:   ...  SaltMinion(id='swarm-minion-10-lbky9G')  -  CPU:   0.00 %   MEM:   0.94 % (RSS)   MEM SUM:   1.64 % (RSS)   CHILD PROCS: 1
I, [2022-12-07T00:28:57.247425 #74081]  INFO -- py3-alma-8-x86-64:   ...  SaltMinion(id='swarm-minion-11-Qn5sMf')  -  CPU:   0.00 %   MEM:   0.94 % (RSS)   MEM SUM:   1.64 % (RSS)   CHILD PROCS: 1
I, [2022-12-07T00:28:57.252685 #74081]  INFO -- py3-alma-8-x86-64:   ...  SaltMinion(id='swarm-minion-12-UIAbRs')  -  CPU:   0.00 %   MEM:   0.94 % (RSS)   MEM SUM:   1.64 % (RSS)   CHILD PROCS: 1
I, [2022-12-07T00:28:57.257597 #74081]  INFO -- py3-alma-8-x86-64:   ...  SaltMinion(id='swarm-minion-13-QVE3Dc')  -  CPU:   0.00 %   MEM:   0.94 % (RSS)   MEM SUM:   1.65 % (RSS)   CHILD PROCS: 1
I, [2022-12-07T00:28:57.262807 #74081]  INFO -- py3-alma-8-x86-64:   ...  SaltMinion(id='swarm-minion-14-1X8F5D')  -  CPU:   0.00 %   MEM:   0.94 % (RSS)   MEM SUM:   1.64 % (RSS)   CHILD PROCS: 1
I, [2022-12-07T00:28:57.267630 #74081]  INFO -- py3-alma-8-x86-64:   ...  SaltMinion(id='swarm-minion-15-6HrBEw')  -  CPU:   0.00 %   MEM:   0.94 % (RSS)   MEM SUM:   1.64 % (RSS)   CHILD PROCS: 1
I, [2022-12-07T00:28:57.273032 #74081]  INFO -- py3-alma-8-x86-64:   ...  SaltMinion(id='swarm-minion-16-xGtJWY')  -  CPU:   0.00 %   MEM:   0.94 % (RSS)   MEM SUM:   1.64 % (RSS)   CHILD PROCS: 1
I, [2022-12-07T00:28:57.277929 #74081]  INFO -- py3-alma-8-x86-64:   ...  SaltMinion(id='swarm-minion-17-M5764n')  -  CPU:   0.00 %   MEM:   0.94 % (RSS)   MEM SUM:   1.64 % (RSS)   CHILD PROCS: 1
I, [2022-12-07T00:28:57.282851 #74081]  INFO -- py3-alma-8-x86-64:   ...  SaltMinion(id='swarm-minion-18-PlQSm5')  -  CPU:   0.00 %   MEM:   0.94 % (RSS)   MEM SUM:   1.64 % (RSS)   CHILD PROCS: 1
I, [2022-12-07T00:28:57.287634 #74081]  INFO -- py3-alma-8-x86-64:   ...  SaltMinion(id='swarm-minion-19-oF0DVu')  -  CPU:   0.00 %   MEM:   0.94 % (RSS)   MEM SUM:   1.64 % (RSS)   CHILD PROCS: 1
I, [2022-12-07T00:28:59.895079 #74081]  INFO -- py3-alma-8-x86-64: 
I, [2022-12-07T00:28:59.975555 #74081]  INFO -- py3-alma-8-x86-64: tests/pytests/unit/test_beacons.py::test_beacon_process PASSED
I, [2022-12-07T00:28:59.975593 #74081]  INFO -- py3-alma-8-x86-64: ----------------------------- Processes Statistics -----------------------------
I, [2022-12-07T00:28:59.975605 #74081]  INFO -- py3-alma-8-x86-64:   ...........................  System  -  CPU:  67.40 %   MEM:  66.40 % (Virtual Memory)
I, [2022-12-07T00:28:59.975615 #74081]  INFO -- py3-alma-8-x86-64:   ...  SaltMinion(id='minion-mCLWRd')  -  CPU:   0.00 %   MEM:   3.30 % (RSS)   MEM SUM:   4.11 % (RSS)   CHILD PROCS: 1
I, [2022-12-07T00:28:59.991002 #74081]  INFO -- py3-alma-8-x86-64:   ...  SaltMaster(id='master-6eklIP')  -  CPU:   0.00 %   MEM:   0.98 % (RSS)   MEM SUM:  14.72 % (RSS)   CHILD PROCS: 12
I, [2022-12-07T00:29:00.001046 #74081]  INFO -- py3-alma-8-x86-64:   .....  SaltMaster(id='mm-master-1')  -  CPU:   0.00 %   MEM:   0.95 % (RSS)   MEM SUM:  11.99 % (RSS)   CHILD PROCS: 12
I, [2022-12-07T00:29:00.001070 #74081]  INFO -- py3-alma-8-x86-64:   .....  SaltMaster(id='mm-master-2')  -  CPU:   0.00 %   MEM:   0.94 % (RSS)   MEM SUM:  11.68 % (RSS)   CHILD PROCS: 12
I, [2022-12-07T00:29:00.011783 #74081]  INFO -- py3-alma-8-x86-64:   .....  SaltMinion(id='mm-minion-1')  -  CPU:   0.00 %   MEM:   2.64 % (RSS)   MEM SUM:   4.07 % (RSS)   CHILD PROCS: 2
I, [2022-12-07T00:29:00.016886 #74081]  INFO -- py3-alma-8-x86-64:   .....  SaltMinion(id='mm-minion-2')  -  CPU:   0.40 %   MEM:   0.92 % (RSS)   MEM SUM:   2.34 % (RSS)   CHILD PROCS: 2
[...]
I, [2022-12-07T00:51:16.511428 #74081]  INFO -- py3-alma-8-x86-64: tests/pytests/unit/utils/scheduler/test_skip.py::test_run_after_skip_range PASSED
I, [2022-12-07T00:51:16.511518 #74081]  INFO -- py3-alma-8-x86-64: ----------------------------- Processes Statistics -----------------------------
I, [2022-12-07T00:51:16.511545 #74081]  INFO -- py3-alma-8-x86-64:   ...........................  System  -  CPU:  95.60 %   MEM:  76.10 % (Virtual Memory)
I, [2022-12-07T00:51:16.522037 #74081]  INFO -- py3-alma-8-x86-64:   ...  SaltMinion(id='minion-mCLWRd')  -  CPU:   0.00 %   MEM:   3.32 % (RSS)   MEM SUM:   4.13 % (RSS)   CHILD PROCS: 1
I, [2022-12-07T00:51:16.525724 #74081]  INFO -- py3-alma-8-x86-64:   ...  SaltMaster(id='master-6eklIP')  -  CPU:   0.00 %   MEM:   0.99 % (RSS)   MEM SUM:  14.76 % (RSS)   CHILD PROCS: 12
I, [2022-12-07T00:51:16.541850 #74081]  INFO -- py3-alma-8-x86-64:   .....  SaltMaster(id='mm-master-2')  -  CPU:   0.00 %   MEM:   0.98 % (RSS)   MEM SUM:  11.92 % (RSS)   CHILD PROCS: 12
I, [2022-12-07T00:51:16.541908 #74081]  INFO -- py3-alma-8-x86-64:   .....  SaltMinion(id='mm-minion-1')  -  CPU:   2.30 %   MEM:   2.64 % (RSS)   MEM SUM:   4.08 % (RSS)   CHILD PROCS: 2
I, [2022-12-07T00:51:16.553026 #74081]  INFO -- py3-alma-8-x86-64:   .....  SaltMinion(id='mm-minion-2')  -  CPU:   2.30 %   MEM:   0.92 % (RSS)   MEM SUM:   2.35 % (RSS)   CHILD PROCS: 2
I, [2022-12-07T00:51:16.620776 #74081]  INFO -- py3-alma-8-x86-64: 
I, [2022-12-07T00:51:17.856274 #74081]  INFO -- py3-alma-8-x86-64: tests/pytests/unit/utils/scheduler/test_skip.py::test_run_seconds_skip PASSED
I, [2022-12-07T00:51:17.856388 #74081]  INFO -- py3-alma-8-x86-64: ----------------------------- Processes Statistics -----------------------------
I, [2022-12-07T00:51:17.857039 #74081]  INFO -- py3-alma-8-x86-64:   ...........................  System  -  CPU:  22.80 %   MEM:  76.90 % (Virtual Memory)
I, [2022-12-07T00:51:17.862855 #74081]  INFO -- py3-alma-8-x86-64:   ...  SaltMinion(id='minion-mCLWRd')  -  CPU:   0.00 %   MEM:   3.32 % (RSS)   MEM SUM:   4.13 % (RSS)   CHILD PROCS: 1
I, [2022-12-07T00:51:17.868576 #74081]  INFO -- py3-alma-8-x86-64:   ...  SaltMaster(id='master-6eklIP')  -  CPU:   0.00 %   MEM:   0.99 % (RSS)   MEM SUM:  14.76 % (RSS)   CHILD PROCS: 12
I, [2022-12-07T00:51:17.874006 #74081]  INFO -- py3-alma-8-x86-64:   .....  SaltMaster(id='mm-master-2')  -  CPU:   0.00 %   MEM:   0.98 % (RSS)   MEM SUM:  12.06 % (RSS)   CHILD PROCS: 12
I, [2022-12-07T00:51:17.878081 #74081]  INFO -- py3-alma-8-x86-64:   .....  SaltMinion(id='mm-minion-1')  -  CPU:   0.00 %   MEM:   2.64 % (RSS)   MEM SUM:   4.08 % (RSS)   CHILD PROCS: 2
I, [2022-12-07T00:51:17.882233 #74081]  INFO -- py3-alma-8-x86-64:   .....  SaltMinion(id='mm-minion-2')  -  CPU:   0.00 %   MEM:   0.92 % (RSS)   MEM SUM:   2.35 % (RSS)   CHILD PROCS: 2
I, [2022-12-07T00:51:17.984637 #74081]  INFO -- py3-alma-8-x86-64: 

From the (passing) Arch log:

00:51:10,419 [salt.master                                                                     :472 ][DEBUG   ][FileserverUpdate(254274)] [SaltMaster(id='mm-master-2')] Performing fileserver updates for items with an update interval of 60
00:51:10,420 [salt.master                                                                     :454 ][DEBUG   ][FileserverUpdate(254274)] [SaltMaster(id='mm-master-2')] Updating roots fileserver cache
00:51:10,421 [salt.master                                                                     :477 ][DEBUG   ][FileserverUpdate(254274)] [SaltMaster(id='mm-master-2')] Completed fileserver updates for items with an update interval of 60, waiting 60 seconds
00:51:10,432 [salt.utils.process                                                              :1144][DEBUG   ][MainProcess(3145)] Subprocess Schedule(name=test_skip_during_range, jid=20221207005110398558) added
00:51:10,437 [saltfactories.plugins                                                           :76  ][DEBUG   ][MainProcess(3145)] ======= PASSED tests/pytests/unit/utils/scheduler/test_skip.py::test_skip_during_range ========

Notice that mm-master-2 still logs during the last unit tests. I found similar behavior in the CI runs of other pull requests.

I'm not sure about Windows: the blackout daemons seem to be stopped, but memory usage is still rather high at 50% with only two daemons left running:

I, [2022-12-06T22:56:04.226866 #5321]  INFO -- py3-windows-2019-x64: tests/pytests/scenarios/blackout/test_minion_blackout.py::test_blackout_nonwhitelist PASSED
I, [2022-12-06T22:56:04.227535 #5321]  INFO -- py3-windows-2019-x64: ---------------------------- Processes Statistics -----------------------------
I, [2022-12-06T22:56:04.228166 #5321]  INFO -- py3-windows-2019-x64:   ...............................  System  -  CPU:  38.70 %   MEM:  63.10 % (Virtual Memory)  SWAP:  60.60 %
I, [2022-12-06T22:56:04.228805 #5321]  INFO -- py3-windows-2019-x64:   .......................  Test Suite Run  -  CPU:   1.40 %   MEM:  10.51 % (RSS)   MEM SUM:  56.20 % (RSS)   CHILD PROCS: 37
I, [2022-12-06T22:56:04.229425 #5321]  INFO -- py3-windows-2019-x64:   .......  SaltMaster(id='master-BmF5kC')  -  CPU:   0.00 %   MEM:   0.04 % (RSS)   MEM SUM:  18.48 % (RSS)   CHILD PROCS: 13
I, [2022-12-06T22:56:04.230046 #5321]  INFO -- py3-windows-2019-x64:   .......  SaltMinion(id='minion-CiKZiH')  -  CPU:   0.00 %   MEM:   0.04 % (RSS)   MEM SUM:   4.38 % (RSS)   CHILD PROCS: 2
I, [2022-12-06T22:56:04.230663 #5321]  INFO -- py3-windows-2019-x64:   .....  SaltMaster(id='blackout-master')  -  CPU:   0.00 %   MEM:   0.04 % (RSS)   MEM SUM:  17.67 % (RSS)   CHILD PROCS: 13
I, [2022-12-06T22:56:04.231286 #5321]  INFO -- py3-windows-2019-x64:   ...  SaltMinion(id='blackout-minion-1')  -  CPU:   0.00 %   MEM:   0.04 % (RSS)   MEM SUM:   2.58 % (RSS)   CHILD PROCS: 2
I, [2022-12-06T22:56:04.231900 #5321]  INFO -- py3-windows-2019-x64:   ...  SaltMinion(id='blackout-minion-2')  -  CPU:   0.00 %   MEM:   0.04 % (RSS)   MEM SUM:   2.58 % (RSS)   CHILD PROCS: 2
I, [2022-12-06T22:56:04.232520 #5321]  INFO -- py3-windows-2019-x64: 
I, [2022-12-06T22:56:04.233152 #5321]  INFO -- py3-windows-2019-x64: tests/pytests/scenarios/compat/test_with_versions.py::test_ping[SaltMinion~=3002] SKIPPED (Test is not whitelisted for Windows)
I, [2022-12-06T22:56:04.233769 #5321]  INFO -- py3-windows-2019-x64: tests/pytests/scenarios/compat/test_with_versions.py::test_highstate[SaltMinion~=3002] SKIPPED (Test is not whitelisted for Windows)
I, [2022-12-06T22:56:04.234406 #5321]  INFO -- py3-windows-2019-x64: tests/pytests/scenarios/compat/test_with_versions.py::test_cp[SaltMinion~=3002] SKIPPED (Test is not whitelisted for Windows)
I, [2022-12-06T22:56:04.235027 #5321]  INFO -- py3-windows-2019-x64: tests/pytests/scenarios/compat/test_with_versions.py::test_ping[SaltMinion~=3003] SKIPPED (Test is not whitelisted for Windows)
I, [2022-12-06T22:56:04.235647 #5321]  INFO -- py3-windows-2019-x64: tests/pytests/scenarios/compat/test_with_versions.py::test_highstate[SaltMinion~=3003] SKIPPED (Test is not whitelisted for Windows)
I, [2022-12-06T22:56:04.236280 #5321]  INFO -- py3-windows-2019-x64: tests/pytests/scenarios/compat/test_with_versions.py::test_cp[SaltMinion~=3003] SKIPPED (Test is not whitelisted for Windows)
I, [2022-12-06T22:56:04.237000 #5321]  INFO -- py3-windows-2019-x64: tests/pytests/scenarios/compat/test_with_versions.py::test_ping[SaltMinion~=3004] SKIPPED (Test is not whitelisted for Windows)
I, [2022-12-06T22:56:04.237624 #5321]  INFO -- py3-windows-2019-x64: tests/pytests/scenarios/compat/test_with_versions.py::test_highstate[SaltMinion~=3004] SKIPPED (Test is not whitelisted for Windows)
I, [2022-12-06T22:56:04.238239 #5321]  INFO -- py3-windows-2019-x64: tests/pytests/scenarios/compat/test_with_versions.py::test_cp[SaltMinion~=3004] SKIPPED (Test is not whitelisted for Windows)
I, [2022-12-06T22:56:04.238856 #5321]  INFO -- py3-windows-2019-x64: tests/pytests/scenarios/daemons/test_salt_as_daemons.py::test_salt_master_as_daemon SKIPPED (Test is not whitelisted for Windows)
I, [2022-12-06T22:56:04.239475 #5321]  INFO -- py3-windows-2019-x64: tests/pytests/scenarios/daemons/test_salt_as_daemons.py::test_salt_minion_as_daemon SKIPPED (Test is not whitelisted for Windows)
I, [2022-12-06T22:57:16.455978 #5321]  INFO -- py3-windows-2019-x64: tests/pytests/scenarios/failover/multimaster/test_failover_master.py::test_pki ERROR
I, [2022-12-06T22:57:17.002948 #5321]  INFO -- py3-windows-2019-x64: tests/pytests/scenarios/failover/multimaster/test_failover_master.py::test_return_to_assigned_master ERROR
I, [2022-12-06T22:57:18.190074 #5321]  INFO -- py3-windows-2019-x64: tests/pytests/scenarios/failover/multimaster/test_failover_master.py::test_failover_to_second_master ERROR
I, [2022-12-06T22:57:19.206152 #5321]  INFO -- py3-windows-2019-x64: tests/pytests/scenarios/failover/multimaster/test_failover_master.py::test_minion_reconnection ERROR
I, [2022-12-06T22:57:19.206844 #5321]  INFO -- py3-windows-2019-x64: tests/pytests/scenarios/failover/multimaster/test_failover_master.py::test_minions_alive_with_no_master SKIPPED (Skipped on Windows)
I, [2022-12-06T22:58:56.004307 #5321]  INFO -- py3-windows-2019-x64: tests/pytests/scenarios/multimaster/test_multimaster.py::test_basic_command_return ERROR
I, [2022-12-06T22:58:56.005010 #5321]  INFO -- py3-windows-2019-x64: tests/pytests/scenarios/multimaster/test_multimaster.py::test_stopped_first_master ERROR
I, [2022-12-06T22:58:56.005637 #5321]  INFO -- py3-windows-2019-x64: tests/pytests/scenarios/multimaster/test_multimaster.py::test_stopped_second_master ERROR
I, [2022-12-06T22:58:56.006261 #5321]  INFO -- py3-windows-2019-x64: tests/pytests/scenarios/multimaster/test_multimaster.py::test_minion_reconnection_attempts ERROR
I, [2022-12-06T22:58:56.006909 #5321]  INFO -- py3-windows-2019-x64: tests/pytests/scenarios/multimaster/test_offline_master.py::test_minion_hangs_on_master_failure_50814 SKIPPED (Test is not whitelisted for Windows)
I, [2022-12-06T22:58:56.007534 #5321]  INFO -- py3-windows-2019-x64: tests/pytests/scenarios/multimaster/beacons/test_inotify.py::test_beacons_duplicate_53344 SKIPPED (Test is not whitelisted for Windows)
I, [2022-12-06T22:58:56.008160 #5321]  INFO -- py3-windows-2019-x64: tests/pytests/scenarios/multimaster/modules/test_test.py::test_ping SKIPPED (Test is not whitelisted for Windows)
I, [2022-12-06T22:58:56.008839 #5321]  INFO -- py3-windows-2019-x64: tests/pytests/scenarios/multimaster/modules/test_test.py::test_echo SKIPPED (Test is not whitelisted for Windows)
I, [2022-12-06T22:58:56.009471 #5321]  INFO -- py3-windows-2019-x64: tests/pytests/scenarios/multimaster/modules/test_test.py::test_version SKIPPED (Test is not whitelisted for Windows)
I, [2022-12-06T22:58:56.010094 #5321]  INFO -- py3-windows-2019-x64: tests/pytests/scenarios/multimaster/modules/test_test.py::test_conf_test SKIPPED (Test is not whitelisted for Windows)
I, [2022-12-06T22:58:56.010717 #5321]  INFO -- py3-windows-2019-x64: tests/pytests/scenarios/multimaster/modules/test_test.py::test_cross_test SKIPPED (Test is not whitelisted for Windows)
I, [2022-12-06T22:58:56.011360 #5321]  INFO -- py3-windows-2019-x64: tests/pytests/scenarios/multimaster/modules/test_test.py::test_outputter SKIPPED (Test is not whitelisted for Windows)
I, [2022-12-06T22:58:56.012000 #5321]  INFO -- py3-windows-2019-x64: tests/pytests/scenarios/multimaster/modules/test_test.py::test_fib SKIPPED (Test is not whitelisted for Windows)
I, [2022-12-06T22:58:56.012796 #5321]  INFO -- py3-windows-2019-x64: tests/pytests/scenarios/multimaster/modules/test_test.py::test_collatz SKIPPED (Test is not whitelisted for Windows)
I, [2022-12-06T22:58:56.013589 #5321]  INFO -- py3-windows-2019-x64: tests/pytests/scenarios/multimaster/modules/test_test.py::test_get_opts SKIPPED (Test is not whitelisted for Windows)
I, [2022-12-06T23:01:30.100177 #5321]  INFO -- py3-windows-2019-x64: tests/pytests/scenarios/setup/test_install.py::test_wheel[USE_STATIC_REQUIREMENTS=1] PASSED
I, [2022-12-06T23:01:30.100863 #5321]  INFO -- py3-windows-2019-x64: ---------------------------- Processes Statistics -----------------------------
I, [2022-12-06T23:01:30.101482 #5321]  INFO -- py3-windows-2019-x64:   ...........................  System  -  CPU:  53.20 %   MEM:  50.00 % (Virtual Memory)  SWAP:  43.80 %
I, [2022-12-06T23:01:30.102101 #5321]  INFO -- py3-windows-2019-x64:   ...................  Test Suite Run  -  CPU:   4.10 %   MEM:  10.52 % (RSS)   MEM SUM:  33.35 % (RSS)   CHILD PROCS: 17
I, [2022-12-06T23:01:30.102718 #5321]  INFO -- py3-windows-2019-x64:   ...  SaltMaster(id='master-BmF5kC')  -  CPU:   0.00 %   MEM:   0.04 % (RSS)   MEM SUM:  18.47 % (RSS)   CHILD PROCS: 13
I, [2022-12-06T23:01:30.103336 #5321]  INFO -- py3-windows-2019-x64:   ...  SaltMinion(id='minion-CiKZiH')  -  CPU:   0.00 %   MEM:   0.04 % (RSS)   MEM SUM:   4.37 % (RSS)   CHILD PROCS: 2
I, [2022-12-06T23:01:30.103924 #5321]  INFO -- py3-windows-2019-x64: 

lkubb avatar Dec 07 '22 12:12 lkubb

It seems that avoiding the enormous test run (by not touching pytests/conftest.py) solved the memory issue with the Windows tests. I would propose adapting the default configuration in pytests/conftest.py in a separate PR, if at all.

The zombie daemons are still a valid issue with the test suite though.

lkubb avatar Dec 07 '22 20:12 lkubb

> It seems that avoiding the enormous test run (by not touching pytests/conftest.py) solved the memory issue with the Windows tests. I would propose adapting the default configuration in pytests/conftest.py in a separate PR, if at all.

> The zombie daemons are still a valid issue with the test suite though.

There are some changes coming down the line that might make this less of an issue. At least, hopefully.

whytewolf avatar Dec 07 '22 21:12 whytewolf

Any chance this makes it into 3006.0? It once again has merge conflicts that I would need to resolve, and I'm not at home atm.

lkubb avatar Dec 09 '22 15:12 lkubb

Unfortunately, it is looking like it isn't going to make it into 3006; we are going to have to move it to 3007. We are cutting the RC extremely soon, and that will be a feature freeze.

whytewolf avatar Dec 09 '22 16:12 whytewolf

That's a pity, since this PR has been quite work-intensive to create and keep mergeable. I know you guys are very busy though, so I'm really thankful for your work. :)

lkubb avatar Dec 09 '22 17:12 lkubb

@whytewolf any word on when this can be pulled in? Seems like lkubb has been really pushing to keep it alive (yes, I do see that the CI needs a little love). I was really hoping to see it in 3006, but I guess we can't always get what we want. It's been 4 or 5 months since you commented here though, so just giving it a bump.

mdschmitt avatar Apr 17 '23 02:04 mdschmitt

> @whytewolf any word on when this can be pulled in? Seems like lkubb has been really pushing to keep it alive (yes, I do see that the CI needs a little love). I was really hoping to see it in 3006, but I guess we can't always get what we want. It's been 4 or 5 months since you commented here though, so just giving it a bump.

I can say with certainty that this is not going into 3006, as we are right on the verge of releasing it. Right now, the last change to this code was 3 days ago. Once @lkubb is happy and isn't changing code after the 3006 release, we can revisit.

whytewolf avatar Apr 17 '23 03:04 whytewolf

Atm, I'm preparing this for the time after the 3006 release. The last changes I made (reusing the TLS connection, refactoring the utils module) addressed the last big eyesores I felt this code had. There is one more gripe: the SDB module currently overwrites all unrelated keys in a secret instead of patching only the requested key. I might address that here because it's a simple change, as sketched below. It seems reusing the TLS connection introduced a single minor test failure; I will get that fixed before the next release.
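
To illustrate the SDB gripe (a minimal, self-contained sketch with made-up names, not the actual module code): the set operation should read-modify-write the secret so that unrelated keys survive, roughly like this:

# The dict stands in for a KV secret store; patch_secret for the SDB set op.
fake_vault = {"salt/data/minions/db1": {"user": "app", "password": "old"}}

def patch_secret(store, path, key, value):
    """Patch a single key in a KV secret instead of overwriting the secret."""
    current = dict(store.get(path) or {})  # read the existing secret data
    current[key] = value                   # change only the requested key
    store[path] = current                  # write the merged secret back
    return current

patch_secret(fake_vault, "salt/data/minions/db1", "password", "new")
assert fake_vault["salt/data/minions/db1"]["user"] == "app"  # left intact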

I do have some stuff that I'm trying hard not to include here to keep the scope manageable and am hoping to submit in follow-up PRs though. :)

TL;DR: I will be finished changing code soon; I'm just trying to make use of the additional time to increase code quality and anticipate possible review remarks.

lkubb avatar Apr 17 '23 07:04 lkubb

@whytewolf This code can be considered stable and review-ready again now, barring any bugs that crop up during continued usage. In its current form, it's still not thread-safe (ref https://github.com/saltstack/salt/issues/62382), but quite resilient. Adding thread safety should not be too difficult, but I would prefer receiving some core dev input before tackling that, if at all (in this PR, given its scope).
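
For reference, the rough direction I have in mind for thread safety is simply serializing access to the shared authentication state. Purely an illustrative sketch (names are made up, and this is not part of this PR):

import threading

_auth_lock = threading.Lock()  # guards the process-wide auth cache
_cached_token = None

def get_token(authenticate):
    """Return the cached token, letting only one thread authenticate at a time."""
    global _cached_token
    with _auth_lock:
        if _cached_token is None:
            _cached_token = authenticate()  # e.g. perform the AppRole login
        return _cached_token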

lkubb avatar Apr 24 '23 12:04 lkubb

According to @s0undt3ch, the error we are getting is caused by a Sphinx extension.

It looks like vaultpolicylexer is causing it. Can you get that fixed?

whytewolf avatar May 11 '23 16:05 whytewolf

> the error we are getting is caused by a Sphinx extension.

Ah, it seems the release notes are being processed by the Jinja renderer now. I would assume this precedes Sphinx rendering, so the example will need to be wrapped in {% raw %}/{% endraw %} tags (see the snippet after the traceback below). I'm not sure there's anything I can do about the vaultpolicylexer itself, unless I'm missing something.

Traceback (most recent call last):
  File "/home/runner/.local/bin/tools", line 8, in <module>
    sys.exit(main())
  File "/home/runner/.local/lib/python3.10/site-packages/ptscripts/__main__.py", line 26, in main
    parser.parse_args()
  File "/home/runner/.local/lib/python3.10/site-packages/ptscripts/parser.py", line 407, in parse_args
    options.func(options)
  File "/home/runner/.local/lib/python3.10/site-packages/ptscripts/parser.py", line 648, in __call__
    func(self.context, *bound.args, **bound.kwargs)
  File "/home/runner/work/salt/salt/tools/changelog.py", line 384, in update_release_notes
    content = template.render(
  File "/home/runner/.local/lib/python3.10/site-packages/jinja2/environment.py", line 1301, in render
    self.environment.handle_exception()
  File "/home/runner/.local/lib/python3.10/site-packages/jinja2/environment.py", line 936, in handle_exception
    raise rewrite_traceback_stack(source=source)
  File "doc/topics/releases/templates/3007.0.md.template", line 28, in top-level template code
    path "salt/data/minions/{{identity.entity.metadata.minion-id}}" {
  File "/home/runner/.local/lib/python3.10/site-packages/jinja2/environment.py", line 485, in getattr
    return getattr(obj, attribute)
jinja2.exceptions.UndefinedError: 'identity' is undefined
Error: Process completed with exit code 1.
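
Concretely, the offending excerpt in doc/topics/releases/templates/3007.0.md.template should look roughly like this once wrapped (assuming the Jinja pass shown in the traceback is the only renderer that touches the template before Sphinx):

{% raw %}
path "salt/data/minions/{{identity.entity.metadata.minion-id}}" {
    # policy body as before; Jinja now passes the Vault templating through verbatim
}
{% endraw %}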

lkubb avatar May 11 '23 17:05 lkubb

Nope, you were spot on with the raw tag.

whytewolf avatar May 11 '23 17:05 whytewolf

I would like to get @dwoz to review this as well.

Ch3LL avatar May 22 '23 19:05 Ch3LL

@Ch3LL Just a friendly notice in case you missed it: I believe I have addressed/responded to all your remarks some time ago. Would you mind re-reviewing?

lkubb avatar Jun 15 '23 18:06 lkubb

@lkubb Thanks so much for this PR. The vault modules are ones that we've identified to move out into a Salt extension, and we're planning to begin that extension with the awesome work you've put into this PR. Moving the vault modules out into their own repository will allow development on those modules to be done and released outside of the Salt core modules. I'll be looking at getting that code moved over in the next day or two, so please stay tuned. We definitely appreciate your patience. Thanks!

garethgreenaway avatar Jun 16 '23 00:06 garethgreenaway

@garethgreenaway Does that imply I will have to close this PR and create a new one against the new repository? I'm honestly not really motivated to once again rebase and possibly modify the tests for a slightly different test suite (I have not really looked into saltext development so far). I have put so much of my time already into keeping this one alive. Any chance this can get merged before the code is moved over?

lkubb avatar Jun 16 '23 06:06 lkubb

> @lkubb Thanks so much for this PR. The vault modules are ones that we've identified to move out into a Salt extension, and we're planning to begin that extension with the awesome work you've put into this PR. Moving the vault modules out into their own repository will allow development on those modules to be done and released outside of the Salt core modules. I'll be looking at getting that code moved over in the next day or two, so please stay tuned. We definitely appreciate your patience. Thanks!

Why will vault be moved out of core Salt? This is a pretty nice integration, and as such I don't see any reason for it not to be part of the core product (especially since extensions need to be installed separately).

What is the reason for opting out?

voyvodov avatar Jun 16 '23 07:06 voyvodov

> Why will vault be moved out of core Salt? This is a pretty nice integration, and as such I don't see any reason for it not to be part of the core product (especially since extensions need to be installed separately).

I'm surprised by this decision as well. Vault is a natural fit for Salt and, if used at all, is a core part of one's state/pillar tree rendering.

I get the case for extracting modules like elasticsearch, which can be installed and loaded dynamically during a state run. A few of my other Vault modules might in fact be a good fit for an extension (e.g. for the database (https://github.com/saltstack/salt/pull/63314), pki and ssh backends as well as for managing plugins, approles etc.)*

But the core found in this PR is literally at the core of my Salt environment. It should (™) also be quite stable regarding maintenance. Imho, it would make sense to include this part in Salt itself and create a separate extension for deeper integration and faster iteration.

* See my Vault formula; so far, only the vault_db module has automated tests, which are found in the linked draft PR.

lkubb avatar Jun 16 '23 09:06 lkubb