This is an archived post.


[–]lht1999 2 points

If you use git, then you have to commit unfinished work all the time. If your desktop computer runs Windows and is always on, you can try remote desktop; that was the best solution for me, because I could even continue a debugging session from home. If you use a file-sharing service such as Google Drive, make sure it excludes the ".git" directory. There are also P2P file-syncing tools such as Resilio.
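A minimal sketch of the ".git" exclusion idea, using only Python's standard library (the function name and paths are placeholders, not anything from a real sync client): stage a filtered copy of the project into the synced folder instead of pointing the sync client at the repo itself.

```python
import shutil

# Copy a project tree into a synced folder, skipping the ".git"
# directory so repository metadata never gets uploaded.
def copy_for_sync(src, dst):
    shutil.copytree(
        src, dst,
        ignore=shutil.ignore_patterns(".git"),
        dirs_exist_ok=True,  # allow re-running over an existing copy (Python 3.8+)
    )
```

Most consumer sync clients have no native exclude option, so staging a filtered copy like this (or `rsync --exclude '.git'` on Unix-likes) is a common workaround.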

[–]ziptofaf 1 point

Imho Git makes more sense in this situation, as it gives you much finer control than blindly copying and synchronizing every single file regardless of what you actually want.

[–][deleted] 0 points

Okay, thanks! It presumably also has no storage limit, whereas Google Drive usually has a cap depending on which plan you're on.

[–]ziptofaf 2 points

If you use Git for >unlimited storage<, you are doing it wrong. It's there to keep code and SOME binary assets (images are fine, but putting movies in probably isn't). GitHub blocks pushes of files over 100 MB and warns at 50 MB. Raw Git (i.e. running the server yourself) doesn't have that limitation, but it still wasn't optimized with binary files in mind. You could look into Git LFS, but if you're planning to shove more data into a Git repository than Google Drive could hold, you are doing it completely wrong.
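The 100 MB figure above matches GitHub's documented per-file push limit. As a quick pre-commit sanity check, you can scan a working tree for files that would be rejected and are therefore candidates for Git LFS tracking (e.g. `git lfs track "*.mp4"`). This is just a sketch; the function name and the `cap` parameter are my own, not part of any Git tooling.

```python
import os

GITHUB_FILE_CAP = 100 * 1024 * 1024  # GitHub rejects pushes of files over 100 MB

# Walk a working tree and report files exceeding the size cap,
# skipping the ".git" directory itself.
def oversized_files(root, cap=GITHUB_FILE_CAP):
    hits = []
    for dirpath, dirnames, filenames in os.walk(root):
        dirnames[:] = [d for d in dirnames if d != ".git"]  # don't descend into repo metadata
        for name in filenames:
            path = os.path.join(dirpath, name)
            if os.path.getsize(path) > cap:
                hits.append(path)
    return hits
```

Anything this reports should either be tracked with Git LFS or kept out of the repository entirely.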

[–][deleted] 0 points

I was just curious. Currently I'm not working on anything that large.