
Create a shared database to post training results and beating the game collaboratively?

Open S3mz opened this issue 2 years ago • 6 comments

Yo Peter,

Congrats on this project, and thanks for bringing back the fun of beating this game again.

What if we all pooled our resources to share CPU power and training results until we beat this game together?

I was wondering if it would be useful to you (and eventually this community of PokeAiNerds) to create a DB with an endpoint to store and retrieve training results, and connect it to the repo. I could build it, leave it open for donations, and we as a community could share the costs and the training results to feed the model.
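To make the store-and-retrieve idea concrete, here is a minimal sketch using SQLite purely for illustration. The table layout, field names, and the badges-then-steps ranking are all assumptions, not an agreed schema; a hosted DB behind an HTTP endpoint would replace this in practice:

```python
# Hypothetical sketch of a shared training-results store.
# Schema and ranking criteria are illustrative assumptions.
import sqlite3

def init_db(path=":memory:"):
    conn = sqlite3.connect(path)
    conn.execute(
        """CREATE TABLE IF NOT EXISTS results (
               user TEXT, steps INTEGER, badges INTEGER, config TEXT)"""
    )
    return conn

def post_result(conn, user, steps, badges, config=""):
    # each contributor uploads a summary of their run
    conn.execute("INSERT INTO results VALUES (?, ?, ?, ?)",
                 (user, steps, badges, config))
    conn.commit()

def best_result(conn):
    # "best" here is naively the run with the most badges,
    # ties broken by training steps
    return conn.execute(
        "SELECT user, steps, badges FROM results "
        "ORDER BY badges DESC, steps DESC LIMIT 1").fetchone()
```

An in-memory DB is used above so the sketch is self-contained; pointing `init_db` at a file path would persist results between runs.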

We could also create task groups with specific configs to tackle different areas of the game, each with different problems to solve, until we find the great solution.

Well, lmk if this is something that sounds interesting to you and I'll get my hands to work.

S3mz avatar Oct 18 '23 21:10 S3mz

Love this idea! It would be great if folks could share their results and benefit from others' experiments. A DB + endpoint would be fairly complex; maybe something as simple as a Google Doc could be a good first step? I'd have to think about it. But it would be great to have some discussion on this.

PWhiddy avatar Oct 19 '23 01:10 PWhiddy

@PWhiddy @S3mz I have been meaning to take a stab at seeing if we can take this all the way and beat the game. Love the idea of sharing resources. I'm in the middle of getting another server set up as part of my home lab: it has 2x 6-core CPUs with 48 GB of RAM, as well as a Kubernetes cluster of four Raspberry Pis with 4 cores and 8 GB of RAM each.

RWayne93 avatar Oct 19 '23 03:10 RWayne93

@PWhiddy (cc @RWayne93) Yeah, simplifying at this stage is probably a good idea, but to do it with a Google Doc you'll likely need some coordination with the community. My initial thoughts had two goals in mind:

  1. Beating sections of the game collaboratively and feeding the model with the joint data. This would probably involve organizing the community to run training sessions with the same configs in the same areas, and feeding the model with the combined results before a new joint run.

  2. Breaking down the game into quests and allocating training agents to different parts of the game with different init configs, to distribute the resources and gather info from different areas. That might be useful for fine-tuning the model and functions, and help crack issues from previous quests.

Just some ideas, but I'm willing to help with any other strategy that you find helpful.

S3mz avatar Oct 19 '23 09:10 S3mz

Lately I've been thinking that certain sections of the game might also require their own network that then feeds into the overall network.

RWayne93 avatar Oct 19 '23 12:10 RWayne93

Hey all, I liked how this idea also addresses saving, exporting, and possibly comparison. I tried to whip up a demo version of something:

  • resume from multiple checkpoints
  • upload to an alternative server
  • download from a remote server
  • there is a "highest" endpoint, so you can run run_baseline_parallel.py with a --highest flag and download the top model
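For illustration, the resume/--highest logic could be sketched roughly like this. The server address, the /highest endpoint's behavior, and the checkpoint naming pattern are all assumptions here, not necessarily what the demo actually ships:

```python
# Rough sketch of choosing a checkpoint: the newest local save by default,
# or the top model from a sharing server when --highest is passed.
# Server URL, endpoint, and file naming are hypothetical.
import argparse
from pathlib import Path
from urllib.request import urlretrieve

SERVER = "http://localhost:8000"  # placeholder address

def pick_checkpoint(session_dir, highest):
    if highest:
        # assumed: the server's /highest endpoint serves the best model
        dest = Path(session_dir) / "highest.zip"
        urlretrieve(f"{SERVER}/highest", dest)
        return dest
    # otherwise resume from the most recent local checkpoint, by step count
    saves = sorted(Path(session_dir).glob("poke_*_steps.zip"),
                   key=lambda p: int(p.stem.split("_")[1]))
    return saves[-1] if saves else None

if __name__ == "__main__":
    parser = argparse.ArgumentParser()
    parser.add_argument("--highest", action="store_true",
                        help="download the top model from the shared server")
    args = parser.parse_args()
    print(pick_checkpoint("session", args.highest))
```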

The server code accepts uploads and sorts them by steps, but it doesn't really work yet. I think the issue may be that I need to collect data during the training saves and write it into the folder as something like a metadata.json describing the training parameters, so there's a better way to compare the best models.
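The metadata.json idea could be sketched like this: write the run's parameters and stats next to each checkpoint so uploaded models can be compared on more than raw step count. The field names below are illustrative, not an agreed schema:

```python
# Hypothetical sketch: save a metadata.json beside each checkpoint
# so the sharing server can rank models on more than step count.
import json
from pathlib import Path

def save_metadata(checkpoint_path, config, stats):
    meta = {
        "checkpoint": Path(checkpoint_path).name,
        "steps": stats.get("steps"),
        "badges": stats.get("badges"),
        # training hyperparameters, e.g. learning rate, gamma, reward weights
        "config": config,
    }
    out = Path(checkpoint_path).with_suffix(".metadata.json")
    out.write_text(json.dumps(meta, indent=2))
    return out
```

The server could then read these files instead of parsing step counts out of filenames when sorting uploads.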

Anyway, here are some screenshots. The web server also accepts uploads via the menu, and if you refresh, it will sort by the highest step count for now.

Let me know some better ways to approach these issues.

[Screenshots: read from server; upload to remote server; Python server — Screenshot 2023-10-19 at 11 21 55 PM]

techmore avatar Oct 20 '23 03:10 techmore

@S3mz Newest pull request on this issue — feel free to let me know how it works for you!

https://github.com/PWhiddy/PokemonRedExperiments/pull/118

techmore avatar Oct 24 '23 22:10 techmore