I want to have a mirror of my local music collection on my server, and a script that periodically updates the server to, well, mirror my local collection.
But crucially, I want to convert all lossless files to lossy, preferably before uploading them.
That’s the one reason why I can’t just use git - or so I believe.
I also want locally deleted files to be deleted on the server.
Sometimes I even move files around (I believe in directory structure), and again, git would deal with this perfectly, if it weren’t for the lossless-to-lossy caveat.
It would be perfect if my script could recognize moves just like git does, instead of deleting and re-uploading the same file to a different location.
My head is spinning round and round, and before I continue messing around with `find` and `scp`, it’s time to ask the community.
I am writing in bash, but if some Python module could help with it, I’m sure I could find my way around it.
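To show where I’m at, here’s the kind of thing I’ve been sketching (very rough and untested; the paths, the Opus settings, and the remote are placeholders):

```bash
#!/usr/bin/env bash
# Build a lossy mirror of the collection in a local staging dir, then rsync it up.
set -euo pipefail

SRC="$HOME/Music"                 # the collection's home
STAGE="$HOME/.cache/music-lossy"  # local staging copy of what the server should hold
REMOTE="user@server:/srv/music"   # placeholder

mkdir -p "$STAGE"

find "$SRC" -type f | while read -r f; do
    rel="${f#"$SRC"/}"
    case "$f" in
        *.flac)
            # Transcode lossless files (FLAC -> Opus here), keeping the directory structure.
            out="$STAGE/${rel%.flac}.opus"
            mkdir -p "$(dirname "$out")"
            [ -e "$out" ] || ffmpeg -i "$f" -c:a libopus -b:a 128k "$out" </dev/null
            ;;
        *)
            # Already-lossy (and other) files are copied as they are.
            out="$STAGE/$rel"
            mkdir -p "$(dirname "$out")"
            [ -e "$out" ] || cp "$f" "$out"
            ;;
    esac
done

# --delete removes files on the server that are gone from the staging copy.
rsync -av --delete "$STAGE"/ "$REMOTE"/
```

It still doesn’t prune the staging copy when I delete something locally, and a moved file just gets transcoded and uploaded again, which is exactly what I’d like to avoid.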
TIA
additional info:
- Not all files in the local collection are lossless; it’s a variety of formats.
- The purpose of the remote is for listening/streaming with various applications
- The lossy version is for reducing both upload and download (streaming) bandwidth. On mobile broadband, FLAC tends to buffer a lot.
- The home of the collection (and its origin) is my local machine.
- The local machine cannot act as a server
Git is for text files. Your git repo might get very big after some time, especially if you move files. But it’s your choice. Sounds like your problem can be solved with a pre-commit hook.
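Something like this maybe (an untested sketch, assuming ffmpeg is installed; it swaps each staged FLAC for an Opus transcode before the commit is recorded):

```bash
#!/usr/bin/env bash
# .git/hooks/pre-commit - untested sketch: replace staged FLACs with Opus transcodes
set -euo pipefail

git diff --cached --name-only --diff-filter=AM -z -- '*.flac' |
while IFS= read -r -d '' f; do
    out="${f%.flac}.opus"
    ffmpeg -y -i "$f" -c:a libopus -b:a 128k "$out" </dev/null
    git add -- "$out"
    git rm --cached --quiet -- "$f"   # keep the FLAC on disk, but leave it out of the commit
done
```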
I was thinking about this; it’s probably true for the origin, but I’m sure it can be mitigated, or at least minimized, and maybe avoided completely on the cloning side?
Moving files does not noticeably increase git repo size. The files are stored as blob objects. Changing their path does not duplicate them.
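You can see it in a throwaway repo (the file paths here are made up):

```bash
# Blob IDs are derived from file content only, not from the path:
git hash-object Albums/Artist/track.flac
git mv Albums/Artist/track.flac Singles/track.flac
git commit -m "move track"
git hash-object Singles/track.flac   # same ID as before: the existing blob is reused
git count-objects -vH                # only small tree/commit objects were added
```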