
Added "book-store" exercise

Open marcelweikum opened this pull request 8 months ago • 8 comments

Hey,

I added the book-store exercise with the unified pragma mark. (Had a whole lot of time today.)

Kind regards, Marcel

marcelweikum avatar May 09 '25 14:05 marcelweikum

Hello. Thanks for opening a PR on Exercism 🙂

We ask that all changes to Exercism are discussed on our Community Forum before being opened on GitHub. To enforce this, we automatically close all PRs that are submitted. That doesn't mean your PR is rejected but that we want the initial discussion about it to happen on our forum where a wide range of key contributors across the Exercism ecosystem can weigh in.

You can use this link to copy this into a new topic on the forum. If we decide the PR is appropriate, we'll reopen it and continue with it, so please don't delete your local branch.

If you're interested in learning more about this auto-responder, please read this blog post.


Note: If this PR has been pre-approved, please link back to this PR on the forum thread and a maintainer or staff member will reopen it.

github-actions[bot] avatar May 09 '25 14:05 github-actions[bot]

For future PRs: what’s the preferred way to give a heads-up before making changes (e.g., adding a new exercise)? As far as I know, expectations vary between tracks and maintainers, so I’d like to ask what the best approach would be here. Until today, I had been opening issues to announce the changes I was planning to make.

marcelweikum avatar May 13 '25 19:05 marcelweikum

If there are only a few seconds between opening the issue and the PR, I wouldn't bother and would go directly for the PR.

If it is something that means a lot of work for you and might be turned down (like updating everything to a new test framework), you should open an issue for discussion first and ping us. Personally, I'm fine with GitHub only, but you will reach a wider audience on the forums.

I see this mostly as a way to save you from spending time on things that aren't necessary.

Also, half-done and broken PRs are a pain to guide, even more so when people abandon them and piecing things back together becomes more work than starting from scratch. That doesn't seem to be the case with your current PRs, though.

If I'm late with a reply, I don't mind another ping. Life can be busy.

vaeng avatar May 13 '25 20:05 vaeng

Thanks for the info! I’ve usually opened an issue and then immediately started working on the exercise, just so someone knows it’s being worked on. Most of the time it doesn’t take me longer than 30 minutes to an hour to finish. That’s why I just posted the next two or three PRs without opening issues first; I hope that’s okay.

My general plan is to go through all current and available exercises from the problem-specifications repo (excluding the deprecated ones) in alphabetical order and implement them for the C++ track. It’s a great way for me to practice, I really enjoy it, and it helps expand the track a bit.

Would that approach be okay with you long-term, or is there anything I should keep in mind or reconsider?

marcelweikum avatar May 14 '25 13:05 marcelweikum

> My general plan is to go through all current and available exercises from the problem-specifications repo (excluding the deprecated ones) in alphabetical order and implement them for the C++ track. It’s a great way for me to practice, I really enjoy it, and it helps expand the track a bit.

I don't have that much time reviewing currently, so be prepared to wait a bit. Other than that extending the track sounds great! Thanks for the effort!

> Would that approach be okay with you long-term, or is there anything I should keep in mind or reconsider?

From my perspective, that sounds great in general. However, could you maybe open a tracking issue first? We had something similar for the #48in24 exercises (#779). First, having an overview of the missing exercises would be nice to have. Second, we could put people's names next to exercises to "claim" them. I suspect @marcelweikum will claim most of them, but maybe @vaeng or I want to implement one exercise or another as well.

Another topic is tooling. @vaeng has an exercise generator on a branch. I'm not sure what's still missing there, but finishing it up and getting it onto main would be good IMHO.

In addition to something that provides a skeleton, it would be great to have tooling that automatically creates the tests. The Python track has something for this: for each exercise, there's a Jinja template in .meta. For the arm64-assembly track, we have generators written in pure Python. I'm not sure which approach I like better; the template one is maybe a bit cleaner. IMHO we should aim for 100% generated tests, as that would make syncing exercises much easier and less error-prone. We also need a solution for cases where we deviate from the canonical data: for arm64-assembly, extra test cases for some exercises simply live in the generator code, and the Python track probably has something similar.
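To make the pure-generator idea a bit more concrete, such a script could look roughly like the sketch below. Everything in it is an assumption for illustration only: the canonical-data.json field names ("basket", "expected"), the hypothetical `book_store::total()` function under test, and the Catch2-style output. It is not the actual tooling of any track.

```python
# Sketch of a pure-Python test generator (hypothetical; field names such as
# "basket"/"expected" and the book_store::total() signature are assumptions).
import json
from pathlib import Path


def flatten(cases):
    """Yield leaf test cases, walking into nested case groups."""
    for case in cases:
        if "cases" in case:  # a grouping node in canonical-data.json
            yield from flatten(case["cases"])
        else:
            yield case


def render_case(case):
    """Turn one canonical case into a Catch2 TEST_CASE block."""
    basket = ", ".join(str(book) for book in case["input"]["basket"])
    return (
        f'TEST_CASE("{case["description"]}") {{\n'
        f'    REQUIRE({case["expected"]} == book_store::total({{{basket}}}));\n'
        "}\n"
    )


def main():
    data = json.loads(Path("canonical-data.json").read_text())
    tests = "\n".join(render_case(c) for c in flatten(data["cases"]))
    Path("book_store_test.cpp").write_text(
        '#include "book_store.h"\n#include "test/catch.hpp"\n\n' + tests
    )


if __name__ == "__main__":
    main()
```

Extra cases that deviate from the canonical data could then simply be appended inside the generator, as is done for some arm64-assembly exercises.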

@marcelweikum would you be up for working on something like that as well? I think it would make sense to get the tooling in place before adding many more exercises, both to make sure they match the canonical data and to avoid extra work: a generator added retrospectively only helps at the next sync, whereas a generator that exists up front can also create the initial test cases.

I understand if that's not your cup of tea. In that case I'd try to prioritize this myself and come up with some framework. Wdyt?

ahans avatar May 14 '25 14:05 ahans

Hey @ahans,

> First, having an overview of the missing exercises would be nice to have.

That’s a great idea! If no one objects, I’d be happy to go through all exercises in the problem-specifications repo, exclude the deprecated ones, and create an issue that lists the current missing ones. I’d also keep it up to date so it’s easy to see what’s been claimed or completed.

> It would be great to have tooling that automatically creates the tests.

I agree! Long-term, that would be a great step forward and would definitely help reduce manual effort and errors. I’m genuinely interested in working on it. That said, I currently don’t have a solid starting point for how the generators work, apart from using configlet to scaffold exercises, which I believe follows a similar idea, just without generating the tests from the canonical data.

If you or @vaeng could explain briefly what’s missing or where to begin, I’d love to dive into it. I’d need to read up and experiment a bit, but I do have the time and motivation to help with that!

I’m really enjoying contributing to the track and appreciate the opportunity to help shape it.

marcelweikum avatar May 14 '25 15:05 marcelweikum

There is a wiki article about the test and exercise generator in this repo. I'm on mobile, so it's difficult to find right now.

I like the Python templating for maintenance, but boy, it's not straightforward to set up. I translated some exercises for the Python track, and fiddling with the templating took me more time than writing the tests manually would have. Each exercise has a bit of nuance in its template, so it was error-prone for me as a newcomer.
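For comparison, the template-based variant boils down to keeping the test file as a Jinja template and only feeding it the canonical cases. A hypothetical sketch, rendered with jinja2 and ignoring nested case groups; the Python track's real templates use their own shared macros and layout, and the field names and `book_store::total()` call are assumptions:

```python
# Hypothetical sketch of a template-based test generator; not the Python
# track's actual template format.
import json
from pathlib import Path
from jinja2 import Template  # pip install jinja2

TEMPLATE = Template(
    """#include "book_store.h"
#include "test/catch.hpp"

{% for case in cases %}
TEST_CASE("{{ case.description }}") {
    REQUIRE({{ case.expected }} == book_store::total({ {{ case.input.basket | join(", ") }} }));
}
{% endfor %}
"""
)

data = json.loads(Path("canonical-data.json").read_text())
Path("book_store_test.cpp").write_text(TEMPLATE.render(cases=data["cases"]))
```

The fiddly part is exactly what I described above: each exercise ends up needing its own tweaks to the template.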

vaeng avatar May 14 '25 16:05 vaeng

> There is a wiki article about the test and exercise generator in this repo. I'm on mobile, so it's difficult to find right now.

I think I found it: https://github.com/exercism/cpp/wiki/Test-Cases-for-Practice-Exercises

> Each exercise has a bit of nuance in its template, so it was error-prone for me as a newcomer.

Sounds promising! 😆 But I'll try to read up on it and also dive into your exercise_creation_helper.py.

marcelweikum avatar May 14 '25 17:05 marcelweikum