Add support for custom storage backends
Your README states:
If you want to use a database instead, ask for it, we'll come up with some kind of a plugin system.
I'd like to run sinopia on Heroku, which doesn't have a persistent filesystem. I haven't used sinopia yet at all -- how hard would it be to add support for something like S3, postgres, redis, or otherwise?
We're dealing with a lot of binary files (tarballs), and they need to be stored somewhere more or less efficiently. MongoDB will probably be a good choice because of GridFS. I'm not sure about the others.
Ideally, you'd just take the local-fs.js file and rewrite it to use a database. Unfortunately, because of a few optimizations, it won't be that simple.
I have the exact same use case -- I'd ideally like to run this on Heroku, which means a mongo (or postgres) backend would simplify things greatly for me.
Are there any modules for postgres/mongodb implementing standard node fs interface?
None that I'm aware of or can find with a quick googling.
I think leveldb has a lot of potential to be used as a 'pluggable' backend. I used dat in my last project, and their 'showcase demo' is the npm registry :)
Internally dat has two kinds of data storage: tabular and blob. The default tabular data store is LevelDB and the default blob store stores files on the local filesystem. Both of these default backends can be swapped out for other backends.
Some of the modules they built on top of leveldb for dat are pretty impressive. I'm currently hosting sinopia locally and in the cloud. With something like dat's replication protocol, it'd be possible to sync the local registry with the cloud's, or even cloud-cloud. I think leveldb could help bridge the gap between node's fs-streaming module and databases like mongo / postgres, while still using fs streams at its core.
Just my 0.02, discovered sinopia yesterday :)
+1
Looking into implementing Sinopia, but concerned about scaling; being able to tap into Mongo for storage would be ideal.
The project looks very promising!
+1
+1
We need to set up sinopia to use S3.
+1 same here :-)
I suppose vinyl could be used, as they already have loads of adapters (including vinyl-s3).
PS: great project. I first headed over to cnpm, but sinopia is way easier and slicker.
+1
+1, I've started some work on a MongoDB backend (with the advice stated here). @rlidwka Do you already have an idea of how you want the pluggable architecture to work? How would users switch from the local filesystem to MongoDB (or another backend)?
Should switching to a MongoDB backend be:
- a configuration option
- a separate pluggable npm package (for example, `npm install sinopia-mongo-plugin` installs the backend and sinopia knows to look for it and use it if it exists)
- a fork of sinopia called `sinopia-mongo`
@SyntaxRules , I was thinking about it as a plugin, same as auth plugins. So kinda second option. You add something like:
```yaml
fs:
  sinopia-mongo-plugin:
    database: blah
    password: blahblah
```
The first option is out, because we really don't need a mongodb dependency here. And a fork is usually hard to maintain.
There is another way of doing this: using io.js with -r option (see https://github.com/iojs/io.js/pull/881). This way you can write a driver to intercept any fs calls from any application (not just sinopia) and redirect them to mongo. So if you have similar issues with other node modules, and don't want to ask for db support in all of them, this might be a way to go.
When can I get sinopia-mongo module?
+1 for S3 backend
+1 for S3 modular storage
For all the folks looking for S3 support, this may be helpful: https://github.com/jbuck/npm-readonly-mirror
@zeke thanks for the tip. I already knew about it, but I think most people here are looking for the private-registry feature :p
+1 for S3
:+1: for S3. Please, if you add a MongoDB backend, make it so we can swap in something else easily (and completely).
+1 for S3
+1 for S3
If you use docker, it might be easier to store the data in a separate volume and then use that to sync with S3. sinopia has an amazing codebase, but it's rather complicated (at least for me), so IMO using containers seems simpler. Unless of course somebody has already built this storage module :P
I just googled 'docker s3 storage' and found a number of resources actually :)
https://docs.docker.com/registry/storage-drivers/s3/
https://github.com/whatupdave/docker-s3-volume
https://hub.docker.com/r/yaronr/backup-volume-container/
It's fairly simple to expose the folder which holds all the files, and you can share that volume container with something running node (e.g. vinyl) or python that utilizes the S3 API. The shared container will be able to update, remove, and store content without having to shut down the server.
Edit: I explain how to expose the volume here:

```shell
docker run --name sinopia -d -p 4873:4873 -v <local-path-to-storage>:/sinopia/storage rnbwd/sinopia
```
I maybe changed 2 lines of code from this repository to get it working nicely with docker (I had to change some default configs and pin down exactly where the storage folder is located, which is slightly different behavior than the normal module); everything else is the same.
https://docs.docker.com/registry/storage-drivers/s3/
Docker registry's S3 storage driver (useless here).
https://github.com/whatupdave/docker-s3-volume
Doesn't offer updates from S3 (in case another instance updates a file).
https://hub.docker.com/r/yaronr/backup-volume-container/
Same as before: it will watch for local changes, but not for remote ones.
The best choice we had until now was to use ours with btsync.
@cusspvz I shared those links to provide examples of different implementations for docker. btsync looks really interesting though, thanks for sharing.
I'm not opposed to a sinopia plugin for S3, but this has been discussed for over a year. I've gone over the source a few times and even implemented a few plugins (nothing to do with S3), but I honestly have no idea where to begin. I know this is possible with docker today; I don't know how close anyone is to developing an S3 plugin for sinopia at the moment.
Edit: this issue was filed 2 years ago
+1 for S3
+1 for S3
I am looking to implement it as follows:
- create an S3 bucket in us-west-2
- mount that bucket on my sinopia server using s3fs (https://github.com/s3fs-fuse/s3fs-fuse)
- create an S3 bucket in us-east-2 (failover)
- mount that bucket read-only on the failover sinopia server
- set up inter-region S3 replication to copy the data
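A minimal sketch of the mount commands for that setup, assuming s3fs-fuse is installed and credentials come from an instance IAM role; the bucket names and mount point are placeholders, and the commands are echoed for review rather than executed:

```shell
# Placeholder names; adjust to your own buckets and paths.
BUCKET_PRIMARY="my-sinopia-us-west-2"
BUCKET_FAILOVER="my-sinopia-us-east-2"
MOUNT_POINT="/sinopia/storage"

# Primary server: read-write mount.
echo "s3fs ${BUCKET_PRIMARY} ${MOUNT_POINT} -o iam_role=auto -o allow_other"

# Failover server: read-only mount of the replicated bucket.
echo "s3fs ${BUCKET_FAILOVER} ${MOUNT_POINT} -o ro -o iam_role=auto -o allow_other"
```

Worth noting that FUSE-backed S3 mounts have consistency and performance caveats for many small files, so this is worth testing under realistic load before relying on it.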