Wikimedia Commons extraction
Implemented a first version of the Wikimedia Commons infobox extractor (`WikimediaCommonsInfoboxExtractor`). I also configured the properties file that lists the extractors used to extract data from Wikimedia Commons; a hypothetical sketch of the extractor's shape follows.
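This is a minimal sketch only, not the code in this PR: the trait name `PageNodeExtractor`, the `Quad` constructor, the `DBpediaDatasets.InfoboxProperties` constant, and the property-URI prefix are assumptions based on the general shape of other mapping extractors in the DBpedia extraction framework, whose package layout and signatures vary between versions.

```scala
package org.dbpedia.extraction.mappings

// Hypothetical skeleton; imports and signatures follow the general shape
// of the DBpedia extraction framework and may differ between versions.
import org.dbpedia.extraction.config.provenance.DBpediaDatasets
import org.dbpedia.extraction.transform.Quad
import org.dbpedia.extraction.util.Language
import org.dbpedia.extraction.wikiparser.{PageNode, TemplateNode}

class WikimediaCommonsInfoboxExtractor(
    context: { def language: Language } // structural type, as in other extractors
) extends PageNodeExtractor {

  // Dataset the extracted quads are written to (assumed constant name).
  override val datasets = Set(DBpediaDatasets.InfoboxProperties)

  override def extract(page: PageNode, subjectUri: String): Seq[Quad] = {
    for {
      // Visit every template on the page; each non-empty property
      // value becomes one quad about the page's subject URI.
      template <- page.children.collect { case t: TemplateNode => t }
      property <- template.children
      value = property.children.map(_.toPlainText).mkString.trim
      if value.nonEmpty
    } yield new Quad(
      context.language,
      DBpediaDatasets.InfoboxProperties,
      subjectUri,
      "http://commons.dbpedia.org/property/" + property.key,
      value,
      property.sourceIri,
      null // plain literal; this sketch infers no datatype
    )
  }
}
```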
SonarCloud Quality Gate failed.
0 Bugs
0 Vulnerabilities (and 0 Security Hotspots to review)
9 Code Smells
No Coverage information
73.9% Duplication
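Alongside the extractor class, the PR configures `dump/extraction.commons.properties` (the second of the two changed files listed in the review below), which registers the extractors to run against the Commons dump. A hypothetical excerpt, assuming the key names conventionally used in the framework's dump properties files; the actual keys and extractor list in this PR may differ:

```properties
# Hypothetical excerpt of dump/extraction.commons.properties.
# Location of the downloaded Wikimedia Commons dump (assumed path)
base-dir=/data/wikipedia-dumps
# Extract from the Commons wiki rather than a language edition
languages=commons
# Extractor classes to run, relative to org.dbpedia.extraction.mappings
extractors=.WikimediaCommonsInfoboxExtractor,.LabelExtractor
```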
@coderabbitai full review
✅ Actions performed
Full review triggered.
> [!WARNING]
> **Rate limit exceeded**
> @JJ-Author has exceeded the limit for the number of commits or files that can be reviewed per hour. Please wait 15 minutes and 19 seconds before requesting another review.
⌛ How to resolve this issue?
After the wait time has elapsed, a review can be triggered using the `@coderabbitai review` command as a PR comment. Alternatively, push new commits to this PR. We recommend that you space out your commits to avoid hitting the rate limit.
📥 Commits
Reviewing files that changed from the base of the PR and between eb0e463d07f68cb2e10ee840b4a13fd5f3ff7a5a and 5b31dd53b8cb45b5f2ba81d69a00405a47b46166.
📒 Files selected for processing (2)
- `core/src/main/scala/org/dbpedia/extraction/mappings/WikimediaCommonsInfoboxExtractor.scala` (1 hunks)
- `dump/extraction.commons.properties` (1 hunks)
✨ Finishing touches
🧪 Generate unit tests (beta)
- [ ] Create PR with unit tests
- [ ] Post copyable unit tests in a comment
- [ ] Commit unit tests in branch `gsoc-mykola`
Comment `@coderabbitai help` to get the list of available commands and usage tips.