
Add support for custom storage backends

Open • zeke opened this issue 12 years ago • 28 comments

Your README states:

If you want to use a database instead, ask for it, we'll come up with some kind of a plugin system.

I'd like to run sinopia on Heroku, which doesn't have a persistent filesystem. I haven't used sinopia yet at all -- how hard would it be to add support for something like S3, postgres, redis, or otherwise?

zeke avatar Dec 08 '13 02:12 zeke

We're dealing with a lot of binary files (tarballs), and they need to be stored somewhere more or less efficiently. MongoDB would probably be a good choice because of GridFS. I'm not sure about the others.

Ideally, you'd just take the local-fs.js file and rewrite it to use a database. Unfortunately, because of a few optimizations, it won't be that simple.
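
To give a rough idea of the shape such a rewrite could take, here is a minimal GridFS-backed sketch. The method names and the 2.x mongodb driver usage are illustrative assumptions, not the actual local-fs.js interface:

// mongo-storage.js -- hypothetical database-backed replacement for local-fs.js.
// Method names are illustrative; a real implementation would have to match
// sinopia's storage interface method-for-method.
var mongodb = require('mongodb');

function MongoStorage(url) {
  // connect() returns a promise resolving to a db handle on the 2.x driver
  this.ready = mongodb.MongoClient.connect(url);
}

// Store a tarball (or any blob) under a key instead of a filesystem path.
MongoStorage.prototype.write = function (name, buf, cb) {
  this.ready.then(function (db) {
    var stream = new mongodb.GridFSBucket(db).openUploadStream(name);
    stream.on('error', cb);
    stream.on('finish', function () { cb(null); });
    stream.end(buf);
  }, cb);
};

// Read a blob back as a Buffer.
MongoStorage.prototype.read = function (name, cb) {
  this.ready.then(function (db) {
    var chunks = [];
    new mongodb.GridFSBucket(db).openDownloadStreamByName(name)
      .on('error', cb)
      .on('data', function (c) { chunks.push(c); })
      .on('end', function () { cb(null, Buffer.concat(chunks)); });
  }, cb);
};

module.exports = MongoStorage;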

rlidwka avatar Dec 08 '13 12:12 rlidwka

I have the exact same use case -- I'd ideally like to run this on Heroku, which means a mongo (or postgres) backend would simplify things greatly for me.

zacronos avatar Aug 14 '14 21:08 zacronos

Are there any modules for postgres/mongodb implementing the standard node fs interface?

rlidwka avatar Aug 14 '14 22:08 rlidwka

None that I'm aware of or can find with a quick googling.

zacronos avatar Aug 14 '14 22:08 zacronos

I think leveldb has a lot of potential to be used as a 'pluggable' backend. I used dat in my last project, and their 'showcase demo' is the npm registry :)

Internally dat has two kinds of data storage: tabular and blob. The default tabular data store is LevelDB and the default blob store stores files on the local filesystem. Both of these default backends can be swapped out for other backends.

Some of the modules they built on top of leveldb for dat are pretty impressive. I'm currently hosting sinopia locally and in the cloud. With something like dat's replication protocol, it'd be possible to sync the local registry with the cloud one, or even cloud-to-cloud. I think leveldb could help bridge the gap between node's fs streaming and databases like mongo / postgres, while still using fs streams at its core.
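
As a tiny example of the blob side, something like this could work (the level package and the key scheme are just assumptions to illustrate the idea):

// Minimal sketch: stash package tarballs as binary values in LevelDB.
// The "name/version" key scheme is an assumption for illustration.
var level = require('level');
var fs = require('fs');

var db = level('./registry-blobs', { valueEncoding: 'binary' });

// Save a tarball under a name/version key.
function saveTarball(name, version, tarballPath, cb) {
  fs.readFile(tarballPath, function (err, buf) {
    if (err) return cb(err);
    db.put(name + '/' + version, buf, cb);
  });
}

// Read it back as a Buffer.
function loadTarball(name, version, cb) {
  db.get(name + '/' + version, cb);
}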

Just my 0.02, discovered sinopia yesterday :)

RnbWd avatar Nov 16 '14 01:11 RnbWd

+1

Looking into implementing Sinopia, but concerned about scaling; being able to tap into Mongo for storage would be ideal.

The project looks very promising!

knksmith57 avatar Nov 20 '14 22:11 knksmith57

+1

scamden avatar Dec 15 '14 19:12 scamden

+1

We need to set up sinopia to use S3.

mrjackdavis avatar Jan 05 '15 04:01 mrjackdavis

+1 same here :-)

I suppose vinyl could be used, as they already have loads of adapters (including vinyl-s3).

PS: great project. I first headed over to cnpm but sinopia is way easier and slicker.

thom4parisot avatar Jan 21 '15 19:01 thom4parisot

+1

stephengfriend avatar Apr 16 '15 18:04 stephengfriend

+1, I've started some work on a mongodb backend (following the advice stated here). @rlidwka Do you already have an idea of how you want the pluggable architecture to work? How would users switch from the local filesystem to a mongo db (or another store)?

Should switching to a mongo db be:

  1. a configuration option
  2. a separate pluggable npm package (for example, npm install sinopia-mongo-plugin installs the backend, and sinopia knows to look for it and use it if it exists)
  3. a fork of sinopia called sinopia-mongo

SyntaxRules avatar Apr 23 '15 16:04 SyntaxRules

@SyntaxRules, I was thinking about it as a plugin, the same as the auth plugins, so basically the second option. You'd add something like:

fs:
  sinopia-mongo-plugin:
    database: blah
    password: blahblah
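
If it followed the same convention as the auth plugins, the sinopia-mongo-plugin package might export something roughly like this (the factory signature and method names are assumptions; there is no storage plugin API yet):

// Hypothetical entry point for a sinopia-mongo-plugin package: sinopia would
// require() the package named in the config and call the exported factory
// with the options from the fs: section above (database, password, ...).
// The method names are placeholders for whatever interface sinopia exposes.
module.exports = function (config) {
  // connect to config.database with config.password here, e.g. via GridFS
  // along the lines of the sketch earlier in this thread
  return {
    write: function (name, buf, cb) { cb(new Error('not implemented yet')); },
    read:  function (name, cb)      { cb(new Error('not implemented yet')); }
  };
};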

The first option is out, because we really don't need a mongodb dependency here. And a fork is usually hard to maintain.

There is another way of doing this: using io.js with the -r option (see https://github.com/iojs/io.js/pull/881). This way you can write a driver that intercepts any fs calls from any application (not just sinopia) and redirects them to mongo. So if you have similar issues with other node modules and don't want to ask for db support in all of them, this might be the way to go.
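
A rough sketch of such a preloaded driver (the storage path and the set of patched functions are placeholders; a real driver would serve the data from the database instead of just passing through):

// fs-to-db.js -- load with `iojs -r ./fs-to-db.js sinopia` (or any app).
// Monkey-patches a couple of fs functions so that calls under a chosen
// directory can be redirected; here they are only logged and passed through.
var fs = require('fs');
var path = require('path');

var INTERCEPT_DIR = path.resolve('./storage'); // hypothetical storage dir

function intercepted(file) {
  return path.resolve(String(file)).indexOf(INTERCEPT_DIR) === 0;
}

var originalReadFile = fs.readFile;
fs.readFile = function (file) {
  if (intercepted(file)) {
    // a real driver would fetch the blob from mongo and invoke the callback
    console.log('[fs-to-db] read', file);
  }
  return originalReadFile.apply(fs, arguments);
};

var originalWriteFile = fs.writeFile;
fs.writeFile = function (file) {
  if (intercepted(file)) {
    // a real driver would persist the data to mongo here
    console.log('[fs-to-db] write', file);
  }
  return originalWriteFile.apply(fs, arguments);
};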

rlidwka avatar Apr 24 '15 14:04 rlidwka

When can I get a sinopia-mongo module?

iamdenny avatar Apr 30 '15 01:04 iamdenny

+1 for S3 backend

philmander avatar Jul 27 '15 12:07 philmander

+1 for S3 modular storage

cusspvz avatar Sep 30 '15 21:09 cusspvz

For all the folks looking for S3 support, this may be helpful: https://github.com/jbuck/npm-readonly-mirror

zeke avatar Oct 01 '15 16:10 zeke

@zeke thanks for the tip. I already knew about it, but I think most people here are looking for the private registry feature :p

cusspvz avatar Oct 01 '15 17:10 cusspvz

+1 for S3

RemoteCTO avatar Nov 18 '15 12:11 RemoteCTO

:+1: for S3. Please, if you add a mongo db backend, make it so we can easily (and completely) switch to something else.

mcansky avatar Dec 11 '15 11:12 mcansky

+1 for S3

thomaspapiernik avatar Dec 11 '15 11:12 thomaspapiernik

+1 for S3

stilliard avatar Jan 07 '16 13:01 stilliard

If you use docker, it might be easier to store the data in a separate volume and then use that to sync with s3. Sinopia has an amazing codebase, but it's rather complicated (at least for me), so IMO using containers seems simpler, unless of course somebody has already built this storage module :P


RnbWd avatar Jan 08 '16 04:01 RnbWd

I just googled 'docker s3 storage' and actually found a number of resources :)

https://docs.docker.com/registry/storage-drivers/s3/
https://github.com/whatupdave/docker-s3-volume
https://hub.docker.com/r/yaronr/backup-volume-container/

It's fairly simple to expose the folder which holds all the files, and you can share that volume container with something running node (vinyl, say) or python or anything else that utilizes the s3 API; the shared container will be able to update, remove, and store content without having to shut down the server.

Edit: here's how I expose the volume:

docker run --name sinopia -d -p 4873:4873 -v <local-path-to-storage>:/sinopia/storage rnbwd/sinopia

I maybe changed 2 lines of code from this repository to get it working nicely with docker: I had to change some default configs and know exactly where the storage folder is located (slightly different behavior than the normal module). Everything else is the same.

RnbWd avatar Jan 08 '16 04:01 RnbWd

https://docs.docker.com/registry/storage-drivers/s3/

docker registry's s3 storage (useless here)

https://github.com/whatupdave/docker-s3-volume

doesn't offer updates from S3 (in case another instance updates a file)

https://hub.docker.com/r/yaronr/backup-volume-container/

Same as before, it will watch for local changes, but not for remote ones.

The best choice we've had until now has been to use our own setup with btsync.

cusspvz avatar Jan 08 '16 13:01 cusspvz

@cusspvz I shared those links to provide examples of different docker implementations, but btsync looks really interesting, thanks for sharing.

I'm not opposed to a sinopia plugin for s3, but this has been discussed for over a year. I've gone over the source a few times and even implemented a few plugins (nothing to do with s3), but I honestly have no idea where to begin. I know this is possible with docker today; I don't know how close anyone is to developing an s3 plugin for sinopia at the moment.

Edit: this issue was filed 2 years ago

RnbWd avatar Jan 09 '16 02:01 RnbWd

+1 for S3

swordeh avatar Jun 16 '16 12:06 swordeh

+1 for S3

nopol avatar Dec 16 '16 08:12 nopol

I am looking to implement it as follows:

  • create an s3 bucket in us-west-2
  • mount that bucket on my sinopia server using s3fs ( https://github.com/s3fs-fuse/s3fs-fuse )
  • create an s3 bucket in us-east-2 (failover)
  • mount that bucket read-only on the failover sinopia server
  • set up inter-region S3 replication to copy the data

twellspring avatar Feb 10 '17 01:02 twellspring