Tag Section Returning Same Results Repeatedly
Please follow the guide below:
- Issues submitted without this template format will be ignored.
- Please read the questions carefully and answer completely.
- Do not post screenshots of error messages or code.
- Put an x into all the boxes [ ] relevant to your issue (==> [x], no spaces).
- Use the Preview tab to see how your issue will actually look.
- Issues about reverse engineering are out of scope and will be closed without response.
- Any mention of spam-like actions or spam-related tools/libs/etc is strictly not allowed.
Before submitting an issue, make sure you have:
- [X] Updated to the latest version v1.6.0
- [X] Read the README and docs
- [X] Searched the bugtracker for similar issues including closed ones
- [X] Reviewed the sample code in tests and examples
Which client are you using?
- [X] app (instagram_private_api/)
- [ ] web (instagram_web_api/)
Describe your Question/Issue:
My issue is that the tag_section() function keeps returning the same posts once I iterate past 3 pages. For example, if I search the term 'cats' or 'pet' and paginate repeatedly, extracting the media items from each response, I start getting repeat results after page 3. This caught me out the first time around and got an account blocked after Instagram flagged the activity as suspicious (not to mention it skewed the statistical analysis I was doing). At the time I was paginating with next_max_id; switching to the page parameter only slightly improved my results, and I still run into the same issue described above.
The way I know this: I kept track of the media codes and, as a test, ran through 6 pages of the 'pet' tag, storing the media results. Out of 177 results, only 32 were unique; the rest were duplicates. Does anyone have a method of approaching this (maybe I am doing something wrong here)? Is there a limit on how many unique pieces of content can be returned for a single tag?
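For context, the duplicate check I ran looks roughly like this. It is a minimal sketch: the pages here are simulated data standing in for real tag_section() responses, and the "code" key is just the field I happened to track, not a claim about the exact response shape.

```python
def count_unique_codes(pages):
    """Count total media items and unique media codes across paginated results."""
    seen = set()
    total = 0
    for page in pages:
        for item in page:
            total += 1
            seen.add(item["code"])
    return total, len(seen)

# Simulated responses: pages 1-3 return fresh items,
# then pages 4-6 repeat the items from page 1.
pages = [
    [{"code": f"p{i}_{j}"} for j in range(30)] for i in range(3)
] + [
    [{"code": f"p0_{j}"} for j in range(30)] for _ in range(3)
]

total, unique = count_unique_codes(pages)
print(total, unique)  # 180 items fetched, only 90 unique
```

With real responses the ratio was far worse (32 unique out of 177), which is what makes me suspect the pagination itself is looping rather than my extraction code.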
So far the issue has made collecting some basic public data painfully slow and led to a couple of my accounts being flagged, all without gathering much new data, since I am mostly iterating over already-collected posts.
If anyone has an approach I am all ears.