
Feature request: add trackers from file instead of url

Open teotikalki opened this issue 10 months ago • 9 comments

As the title says. It should be possible to use a local text file instead of a remote url as the tracker list.

teotikalki avatar Mar 02 '25 10:03 teotikalki

One could run a tiny webserver locally... http://localhost/file.txt

Or on your network http://192.168.1.x/file.txt

Or within a virtual machine (VirtualBox or QEMU)

AtomicRobotMan0101 avatar Mar 02 '25 19:03 AtomicRobotMan0101

Another thought: if one is running qBitTorrent with the web server interface, put the file into its own file system... have it serve its own file 😄

http://192.168.1.x:8080/file.txt

Don't see why that won't work.

AtomicRobotMan0101 avatar Mar 02 '25 19:03 AtomicRobotMan0101

Hi @teotikalki, I would like to better understand the origin of the request, because this involves modifying the script, and it can be done in two very different ways:

  1. add an argument
  2. have the ability to specify a path to a file in the list of trackers to draw from

Of course, both have "difficulties" to manage.

As it stands, the easiest thing you can do is upload this list to pastebin or similar and then add the link to the list of trackers to download.

What would be the advantage of using a local file?

Jorman avatar Mar 02 '25 20:03 Jorman

So... one shouldn't need to either create a webserver or use an external service to run a local script whose input is a text file.

The advantage is that it would do its job without external dependencies.

If I want to change an entry in a text file, I can do that in a terminal with nano on the line before I invoke the script. I shouldn't have to upload that edit somewhere so that it can then be downloaded... it started as a local file and it's being read by a local script (probably to manage another software instance which is running locally).

I believe that your item 1 is more practical. It is definitely the functionality I expected.

<script.sh> -f trackerlist.txt
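A rough sketch of what such a `-f` option could look like in the .py version; the option name, the helper function, and its wiring are all hypothetical here, not the script's actual code:

```python
import argparse

def load_trackers_from_file(path):
    """Read one tracker URL per line, skipping blank lines and comments."""
    with open(path, encoding="utf-8") as fh:
        return [line.strip() for line in fh
                if line.strip() and not line.lstrip().startswith("#")]

def build_parser():
    # Hypothetical flag: point the script at a local tracker list.
    parser = argparse.ArgumentParser()
    parser.add_argument("-f", "--trackers-file",
                        help="local text file with one tracker URL per line")
    return parser

if __name__ == "__main__":
    args = build_parser().parse_args()
    if args.trackers_file:
        trackers = load_trackers_from_file(args.trackers_file)
        print(f"loaded {len(trackers)} trackers from {args.trackers_file}")
```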

teotikalki avatar Mar 10 '25 19:03 teotikalki

Mmm, local is a strange word: torrents are online, trackers are online, you have to connect online to download. What advantage would a local file have? You have to update it by hand, and you have to format it the right way, compatible with the script. I don't recommend a local webserver, and uploading your trackers file online was only a suggestion: if you use a pastebin you don't need to edit the local file and re-upload it, you edit the file directly on pastebin.

Always keep in mind that the script doesn't download anything: it's all in memory, nothing is written to disk by the script; instead, the tracker data for each torrent involved in the update is modified.

But I will tell you more: actually, the functionality you are looking for is already in the script, just edit it. If you use the .sh version, remove all the online lists:

declare -a live_trackers_list_urls=(
	"https://newtrackon.com/api/stable"
	"https://trackerslist.com/best.txt"
	"https://trackerslist.com/http.txt"
	"https://raw.githubusercontent.com/ngosang/trackerslist/master/trackers_best.txt"
	)

And then enter your list:

cat >"${trackers_list}" <<'EOL'
udp://tracker.coppersurfer.tk:6969/announce
http://tracker.internetwarriors.net:1337/announce
udp://tracker.internetwarriors.net:1337/announce
[...]
http://tracker.tfile.co:80/announce
http://retracker.mgts.by:80/announce
http://peersteers.org:80/announce
http://fxtt.ru:80/announce
EOL

If you use the .py version, remove all the online lists:

live_trackers_list_urls = [
    "https://newtrackon.com/api/stable",
    "https://trackerslist.com/best.txt",
    "https://trackerslist.com/http.txt",
    "https://raw.githubusercontent.com/ngosang/trackerslist/master/trackers_best.txt",
]

And then enter your list:

            trackers_list = """udp://tracker.coppersurfer.tk:6969/announce
http://tracker.internetwarriors.net:1337/announce
udp://tracker.internetwarriors.net:1337/announce
udp://tracker.opentrackr.org:1337/announce
udp://9.rarbg.to:2710/announce
udp://exodus.desync.com:6969/announce
[...]
http://retracker.mgts.by:80/announce
http://peersteers.org:80/announce
http://fxtt.ru:80/announce
"""

In theory, if there are no lists to download, it will use the static list. If that does not work, I will make a correction so that when the online lists cannot be downloaded the script must use the static one, and if no lists are specified at all it must behave the same way. So you have exactly what you are looking for, a local list that you can control.
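The fallback behaviour described here could be sketched like this, merging the online lists and falling back to the static one when nothing downloads; the helper names and bodies are illustrative assumptions, not the script's actual code:

```python
import urllib.request

def fetch_tracker_lines(url, timeout=10):
    """Download one tracker list; return its non-empty lines, or [] on any failure."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            text = resp.read().decode("utf-8", errors="replace")
        return [line.strip() for line in text.splitlines() if line.strip()]
    except Exception:
        return []  # a dead list just contributes nothing

def build_tracker_list(urls, static_trackers):
    """Merge every online list; use the static list only if nothing was downloaded."""
    trackers = []
    for url in urls:
        trackers.extend(fetch_tracker_lines(url))
    return trackers if trackers else list(static_trackers)
```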

Jorman avatar Mar 10 '25 21:03 Jorman

I'll respectfully agree to disagree.

Always keep in mind that the script doesn't download anything: it's all in memory, nothing is written to disk by the script; instead, the tracker data for each torrent involved in the update is modified.

If you're using the data then you've downloaded the data. You're downloading it to memory and using it there instead of downloading it to disk and then loading it into memory and doing the exact same thing.

But I will tell you more, actually, the functionality you are looking for is already in the script, just edit it,

The instructions you give DON'T provide the functionality I'm looking for. I want to download one or more lists and save them as text files and then run the script using that (those) text files. I don't want to manually enter each item from the tracker list into the script itself and I certainly don't want to be overriding existing functionality to do it.

So you have exactly what you are looking for,

See above.

You previously asked what the advantage was to using a local file. Here's one: with your method you download the tracker list again every time the script is called. That means you download the same file every single time you add a torrent. This is inefficient, wasteful, and an argument can be made that it's a security profiling risk.
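For what it's worth, a small on-disk cache with a time-to-live would address the repeated downloads; none of this exists in the script, it is only a sketch of the idea:

```python
import os
import time

def read_cached_trackers(cache_path, max_age_seconds=86400):
    """Return cached tracker lines if the cache file is fresh enough, else None."""
    try:
        if time.time() - os.path.getmtime(cache_path) < max_age_seconds:
            with open(cache_path, encoding="utf-8") as fh:
                return [line.strip() for line in fh if line.strip()]
    except OSError:
        pass  # missing or unreadable cache simply means "no cache"
    return None

def write_tracker_cache(cache_path, trackers):
    """Save the downloaded list so later invocations can skip the download."""
    with open(cache_path, "w", encoding="utf-8") as fh:
        fh.write("\n".join(trackers) + "\n")
```

The script would then try `read_cached_trackers` first and only hit the network on a cache miss.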

I've got a better question to throw back at you: what advantage is there to making your script dependent on the cloud? From where I stand every dependency that is eliminated is a plus and remote dependencies are exponentially more so.

Although I honestly can't comprehend why, it seems clear that you not only have no interest in what I thought was functionality so simple and obvious that it shouldn't even be called a 'feature'... you don't appear to even understand why such a thing would be wanted. That, plus your response, tells me that if I want this to happen I will need to code it myself, at which point I thank you for giving me a starting point.

Thanks for taking the time to respond. Even if I find your mindset to be incomprehensible I appreciate that your attitude is at least inclined towards the helpful.

teotikalki avatar Mar 12 '25 12:03 teotikalki

Hi, maybe I misunderstood: so many people are concerned about files being continuously written to and read from the hard drive, and that's why I used the expression that no files are "downloaded", meaning written to the hard drive. Anyway, you get what I mean: there are no unnecessary writes to disk, because in the beginning I was speculating that this could also be a possible problem.

Editing the script is not forbidden; in fact, it is necessary: one has to set the data for qBittorrent access and, if one wants, set the sources for one's own lists.

Let us come to the rest: the instructions I gave you are to substitute the static list that is set up in the script with your own. The static list in the script is only needed in case the online lists are not working; it is there to provide a minimum of functionality.

Actually, the very early versions of the script saved the file locally and it was only checked/updated once a day, but in the interest of not saving anything to disk, that feature was removed. How can you say it is inefficient and wasteful? If one downloads one torrent a day then it's great, whereas if one downloads one torrent a minute then it's useless; so where is the right fit? From the way you talk it's only right as you say, which doesn't seem very honest as an opinion. I'm not a super computer security expert, what kind of security problems are there? Please explain. If there are security problems, you don't have to use it; it's not as if using it less means fewer problems: if there are flaws, there are always flaws.

Do you want advantages? Online lists are already formatted correctly. Online lists are constantly updated. And if an online list is not available, given the vastness of what's online, there will always be others...

I don't want to pick you apart, but even you depend on online resources: the trackers you put in your local files, you got them from the internet, and each time you have to update them by hand. Trackers change all the time; often even optimized online lists contain broken trackers, let alone lists saved locally.

If it were as needed as you say, why has no one ever mentioned it? Why don't you use qBittorrent's built-in functionality? (screenshot omitted)

I am always willing to implement new functionality if it makes sense, if it is an improvement, if it is a fix for a bug I had not noticed, etc., but this functionality does not seem to me to take the script to a better level; rather, it potentially adds a feature that can create problems, for example:

I add -z file1; then you or someone else realizes that it may be more convenient to also have the option of -z file1 file2 fileN; then someone else thinks it is more convenient to have -z one_entire_folder, and so on.

Then, I would have to check that the file exists, that the file is formatted correctly, etc. etc., all things that I don't have to do with lists already made online
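For the record, the checks being described are fairly small; a minimal sketch of what validating a hypothetical local-file option might involve (the function name and the accepted schemes are assumptions, not anything in the script):

```python
import os

# Schemes commonly seen in tracker announce URLs (an assumption, adjust as needed).
ACCEPTED_SCHEMES = ("http://", "https://", "udp://", "ws://", "wss://")

def validate_tracker_file(path):
    """Return the lines that look like tracker URLs; fail loudly if the file is missing."""
    if not os.path.isfile(path):
        raise FileNotFoundError(f"tracker list not found: {path}")
    valid = []
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            line = line.strip()
            if line.startswith(ACCEPTED_SCHEMES):
                valid.append(line)  # silently drop anything that isn't a tracker URL
    return valid
```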

So, for the time being, I don't think this is a feature that will raise the level of the script. Maybe in the future, if it becomes in demand by the masses, it will be evaluated with the time it deserves.

Jorman avatar Mar 12 '25 22:03 Jorman

maybe I misunderstood: so many people are concerned about files being continuously written to and read from the hard drive,

Who? People who are operating on a RasPi with SDCard-only storage? How often do trackerlists change in your world?

The instructions I gave you are to substitute the static list that is set up in the script with your own.

Let's say I have more than one list, like one for clearnet torrents and one for i2p torrents. It makes sense to keep them as discrete lists... even the git repos I've seen that host tracker lists have several to choose from based on differing needs.

Actually, the very early versions of the script saved the file locally and it was only checked/updated once a day, but in the interest of not saving anything to disk, that feature was removed.

So from this I infer that you ran some daily cron check to get the list?

Do you want advantages? Online lists are already formatted correctly. Online lists are constantly updated.

You seem to think that I'm doing something other than downloading such an online list...

Online lists if not available, given the vastness of online, there will always be others

If you save a local copy then it's available forever...

I don't want to pick you apart, but even you depend on online resources, the trackers you put in your local files, you get them from the internet and each time you have to update them by hand, the trackers change all the time, often even online optimized lists contain broken trackers, let alone lists saved locally.

You aren't picking me apart, you're failing to comprehend me. I've never claimed to not depend on online resources; we're talking about using a script sourced online to apply a list sourced online to affect the performance of software used exclusively to download.

you get them from the internet and each time you have to update them by hand

Says who? I might get them from the internet and never touch them. It's still advantageous to get them once and use them 1000 times.

I might want to edit any given list exactly once, say to add a tracker that I know of that isn't on the netlist. Desiring netlist+1 doesn't seem like too much of an edge case to be considered.

Why don't you use qBittorrent's built-in functionality?

I do... all the time. I found your script while looking to automate exactly that functionality so that I could add my own curated trackerlist to every torrent I add to qB programmatically instead of manually.

Then, I would have to check that the file exists, that the file is formatted correctly, etc. etc., all things that I don't have to do with lists already made online

How does this make sense? Why wouldn't you handle a list EXACTLY the same no matter where it came from? If you don't have to do them with lists found online it's because you're assuming that they're formatted correctly. Why would you assume that you need to do ANYTHING different with a local file? If you're already getting a list from an online source then processing a local file is exactly as complicated as doing the exact same thing you're already doing but with the text sourced from the local file. If you curl trackerlist and then run the script with the hypothetical -f then you're running the same logic on the same data.
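The argument here, that the parsing is identical regardless of source, amounts to a loader that only branches on how it obtains the raw text. A sketch under that assumption, not the script's actual code:

```python
import urllib.request

def parse_tracker_text(text):
    """One tracker URL per non-empty line, however the text was obtained."""
    return [line.strip() for line in text.splitlines() if line.strip()]

def load_trackers(source):
    """Accept either a URL or a local path; the parsing step is shared."""
    if source.startswith(("http://", "https://")):
        with urllib.request.urlopen(source, timeout=10) as resp:
            text = resp.read().decode("utf-8", errors="replace")
    else:
        with open(source, encoding="utf-8") as fh:
            text = fh.read()
    return parse_tracker_text(text)
```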

You seem to have mentally wandered into the woods and are imagining the need for functionality that was never desired. If you need to validate a list then you need to do that no matter the source. If you don't, then you don't need to do that no matter the source. Even the interface in qB says 'enter utorrent compatible list url'. qB doesn't validate; it tells you to use a valid source. If you wanted to make help text for the -f option you could say something like 'enter utorrent compatible list url' and you'd be just as valid as the polished torrent client itself.

To be clear: I never in any way that I was aware of asked for you to add 'the ability to input lists of random urls and have them automatically validated and formatted into a utorrent compatible list'. I asked for the ability to do exactly what was already being done with the exact same data, from a local file instead of a url.

I'm not a super computer security expert, what kind of security problems are there? Please explain.

Downloading a trackerlist, especially from a tracked central source, is fingerprintable activity. Downloading the same file a hundred times from github is a lot more suspicious than, say, NOT doing that. An intelligent use pattern would be to communicate with such a source as few times as possible because every time you do you're informing a potentially hostile party that you're interested in a potentially sanctioned resource. If a random sampling of people download that file exactly once then they are probably interested in torrenting and they may be actively doing it... if a given party downloads that file a hundred times then they're almost certainly doing it.

teotikalki avatar Mar 14 '25 12:03 teotikalki

Who? People who are operating on a RasPi with SDCard-only storage? How often do trackerlists change in your world?

I don't know, but as before, why not worry about that as well?

Let's say I have more than one list, like one for clearnet torrents and one for i2p torrents. It makes sense to keep them as discrete lists... even the git repos I've seen that host tracker lists have several to choose from based on differing needs.

No problem, just "clone" the script, like Clearnet_AddqBittorrentTrackers.py and i2P_AddqBittorrentTrackers.py

Or name them however you prefer and run the one you need.

So from this I infer that you ran some daily cron check to get the list?

Not necessary; the old script saved the file on disk and checked its hash.
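That behaviour, save to disk and only rewrite when the downloaded content differs, might have looked roughly like this; an illustrative reconstruction with a content hash deciding whether to write, not the old script's actual code:

```python
import hashlib
import os

def update_if_changed(path, new_text):
    """Rewrite the saved list only when its SHA-256 differs from the new content."""
    new_digest = hashlib.sha256(new_text.encode("utf-8")).hexdigest()
    if os.path.exists(path):
        with open(path, "rb") as fh:
            if hashlib.sha256(fh.read()).hexdigest() == new_digest:
                return False  # content unchanged, skip the write
    with open(path, "w", encoding="utf-8") as fh:
        fh.write(new_text)
    return True  # file created or updated
```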

You seem to think that I'm doing something other than downloading such an online list...

You asked for benefits, I gave you benefits, nothing more

If you save a local copy then it's available forever...

True, but you miss the point: trackers are not static; they are born, die, sleep, disappear, and so on. Online lists are mutable, and quickly, some more than once a day. Is your static list always the same? Perfect, edit and use the static list inside the script (it works now, I just fixed a little problem).

You aren't picking me apart, you're failing to comprehend me. I've never claimed to not depend on online resources; we're talking about using a script sourced online to apply a list sourced online to affect the performance of software used exclusively to download.

What kind of performance? Obviously there are differences between local and online use, but they are negligible, depending of course on the system used; we are talking about tenths of a second, which should not worry even those using older hardware.

Injecting...  But before a quick cleaning the existing trackers...
done, injected 59 trackers!
./jjj.py -n silent -c  0,11s user 0,02s system 29% cpu 0,453 total
Injecting... The URL list is empty. Using the static tracker list.
 But before a quick cleaning the existing trackers...
done, injected 59 trackers!
./jjj.py -n silent -c  0,08s user 0,02s system 91% cpu 0,110 total

Same list of trackers, online version and static version; obviously there are differences, and I leave the judgment to those who read. Personally, I think that if these differences impair operation or put a strain on the PC running the script, then perhaps that system cannot handle even a single torrent.

Says who? I might get them from the internet and never touch them. It's still advantageous to get them once and use them 1000 times.

I might want to edit any given list exactly once, say to add a tracker that I know of that isn't on the netlist. Desiring netlist+1 doesn't seem like too much of an edge case to be considered.

Make up your mind: do you use them as-is forever, or do you edit them? I repeat, I am not forcing you to use online resources; everyone has their own opinion: for me they are convenient, for others not. Again, I urge you to use the static list, it can be used for exactly what you are looking for.

How does this make sense? Why wouldn't you handle a list EXACTLY the same no matter where it came from? If you don't have to do them with lists found online it's because you're assuming that they're formatted correctly. Why would you assume that you need to do ANYTHING different with a local file? If you're already getting a list from an online source then processing a local file is exactly as complicated as doing the exact same thing you're already doing but with the text sourced from the local file. If you curl trackerlist and then run the script with the hypothetical -f then you're running the same logic on the same data.

You seem to have mentally wandered into the woods and are imagining the need for functionality that was never desired. If you need to validate a list then you need to do that no matter the source. If you don't, then you don't need to do that no matter the source. Even the interface in qB says 'enter utorrent compatible list url'. qB doesn't validate; it tells you to use a valid source. If you wanted to make help text for the -f option you could say something like 'enter utorrent compatible list url' and you'd be just as valid as the polished torrent client itself.

To be clear: I never in any way that I was aware of asked for you to add 'the ability to input lists of random urls and have them automatically validated and formatted into a utorrent compatible list'. I asked for the ability to do exactly what was already being done with the exact same data, from a local file instead of a url.

I'd rather provide a working script; you never know what people will come up with: they think they can pass random files and then they complain that it doesn't work. So I see your point, but it's not the same thing. If you program, and I think you do, you can't tell me it's the same thing, because if I give you a list formatted with two different separators, the functions to parse it are different.

Downloading a trackerlist, especially from a tracked central source, is fingerprintable activity. Downloading the same file a hundred times from github is a lot more suspicious than, say, NOT doing that. An intelligent use pattern would be to communicate with such a source as few times as possible because every time you do you're informing a potentially hostile party that you're interested in a potentially sanctioned resource. If a random sampling of people download that file exactly once then they are probably interested in torrenting and they may be actively doing it... if a given party downloads that file a hundred times then they're almost certainly doing it.

It sounds like you are concerned about security; that's not a criticism, quite the contrary! So I'm wondering: you're concerned about retrieving an online list, a list that contains URLs, which by itself means nothing, but you're not concerned about torrent traffic or data volume? I assume you use a VPN, and if you use a VPN to disguise that traffic, why worry about retrieving an online list? But that's your business; again, you have your beliefs as I have mine, but I urge you one last time to use the static list in the script.

Jorman avatar Mar 14 '25 23:03 Jorman