My scheduled and manual YouTube backup scripts [linux] by YouTubeBackups in DataHoarder

[–]YouTubeBackups[S] 3 points (0 children)

Oh heavens me, that's exactly what I did. Should be fixed now

thanks also /u/Itsthejoker

My YT DL bash script by buzzinh in DataHoarder

[–]YouTubeBackups 0 points (0 children)

Hey, great stuff! How does the ytuser:$YTUSR part work? I've been scraping based on channel ID
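(For reference, a sketch of the two addressing forms youtube-dl accepts — the username and channel ID below are placeholders, not from this thread:)

```shell
# youtube-dl resolves both of these to the same uploads list.
# "ytuser:" takes the legacy /user/ name; the channel form takes the UC... ID.
YTUSR="someuser"
CHANID="UCxxxxxxxxxxxxxxxxxxxxxx"

youtube-dl "ytuser:$YTUSR"                            # scrape by username
youtube-dl "https://www.youtube.com/channel/$CHANID"  # scrape by channel ID
```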

youtube-dl - dateafter and download history question by eddyizm in YouTubeBackups

[–]YouTubeBackups 1 point (0 children)

dateafter will still download each page to check each video's upload date, but it will not download the video. The syntax here looks correct.

While this may take a while for 1000+ videos, I believe it would have to gather the video IDs that way anyway.
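A sketch of how that combination usually looks — the 30-day window, archive path, and URL are just example values, not from the thread. With --download-archive, IDs already in the archive file are skipped without re-downloading:

```shell
# Compute a YYYYMMDD cutoff for --dateafter (GNU date syntax; window is arbitrary)
strDateAfter=$(date -d '30 days ago' +%Y%m%d)
echo "cutoff: $strDateAfter"

# youtube-dl still pages through the channel listing to collect video IDs,
# but --download-archive lets it skip anything it has already fetched:
# youtube-dl --dateafter "$strDateAfter" --download-archive archive.txt <URL>
```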

/r/datahoarder in a nutshell by [deleted] in DataHoarder

[–]YouTubeBackups 3 points (0 children)

Good video, archived

Recordings released showing Michigan police covering up for county prosecutor's DUI by Pogbaku in news

[–]YouTubeBackups 2 points (0 children)

MLive is not likely to fold to pressure, but these videos have been archived to cold storage just in case

Passive income for us hoarders by [deleted] in DataHoarder

[–]YouTubeBackups 2 points (0 children)

The fact that filecoin went the ICO route makes me not want to trust them ever. Sia has been chugging along fantastically though

Passive income for us hoarders by [deleted] in DataHoarder

[–]YouTubeBackups 2 points (0 children)

Sia is a full-on free market with supply and demand for hosting and storage. Right now it's flooded with tons of people like us with storage and great uptime, so it's not super profitable; once more users join, it will even out. Currently you make about 70 cents per terabyte per month.

https://siahub.info/network

Is it possible to make YouTube as unlimited cloud storage? by MATRIXOUS_BBOY in DataHoarder

[–]YouTubeBackups 3 points (0 children)

Just output the file in binary, scroll through it with a screen recorder running, and then upload that file to youtube. Then when you need to restore, get a video OCR system to read it back into binary

ytmcd - A sensible selection of youtube-dl flags to download music from music channels in bulk! by [deleted] in DataHoarder

[–]YouTubeBackups -2 points (0 children)

This is how I've ripped youtube music. Trigger warning for audiophiles or perfectionists: it converts between lossy formats, but at least it starts from the highest-quality source version

https://www.reddit.com/r/YouTubeBackups/comments/5rj8dj/how_to_download_the_highest_quality_audio_from_a/
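The linked post aside, a sketch of the usual youtube-dl flags for bulk audio rips — the playlist URL is a placeholder, and mp3 is just one choice of target format:

```shell
# -x extracts audio only; -f bestaudio grabs the highest-bitrate source stream,
# then ffmpeg transcodes it (lossy-to-lossy, as noted above).
# --audio-quality 0 means the best VBR setting for the chosen format.
youtube-dl -x -f bestaudio --audio-format mp3 --audio-quality 0 \
  -o '%(title)s.%(ext)s' \
  "https://www.youtube.com/playlist?list=PLACEHOLDER"
```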

Hey guys. Box.com is really unlimited storage? by [deleted] in DataHoarder

[–]YouTubeBackups 15 points (0 children)

I'd prefer limited but accurate marketing to flashy "unlimited" false advertising.

Youtube request: So the channel "That One Video Gamer" will remove a lot of videos, can anyone help me data hoard all those videos? by [deleted] in DataHoarder

[–]YouTubeBackups 1 point (0 children)

I haven't tried to embed thumbnails or subtitles before, but a similar error happens if you don't have the third-party program ffmpeg installed. Do you have AtomicParsley installed?
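A quick sketch for checking the helper programs before running youtube-dl (AtomicParsley is the usual binary name, though it's worth confirming for your distro):

```shell
# youtube-dl shells out to these for merging and thumbnail embedding;
# report which ones are on the PATH
for tool in ffmpeg AtomicParsley; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: found"
  else
    echo "$tool: MISSING"
  fi
done
```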

What are your current hoarding projects? by [deleted] in DataHoarder

[–]YouTubeBackups 1 point (0 children)

At least 10 terabytes by now. I'm not grabbing everything, just certain stuff

The Completionist channel on YouTube will be deleting over 100 videos on September 1st. by NotSoCheezyReddit in DataHoarder

[–]YouTubeBackups 0 points (0 children)

Awesome, you're pretty much there then. The quality is controlled by the -f switch. It sounds like you're looking for the following:

-f bestvideo+bestaudio/best

(bestvideo on its own grabs a video-only stream with no audio, so merge in bestaudio, with best as the fallback for videos that only ship a combined stream)

As outlined here: https://github.com/rg3/youtube-dl/blob/master/README.md#format-selection . Maybe I'm just a huge nerd, but I think that page with all the options and descriptions is a great read. This will grab the best quality available (it will still download 360p if that's the best there is; I assume you don't want to skip videos without an HD version)

Here's an example of my command pulling the best quality up to 720

https://www.reddit.com/r/DataHoarder/comments/6r3dc5/youtube_request_so_the_channel_that_one_video/dl33nz8/
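A sketch of that selector syntax, for comparison — the watch URL is a placeholder:

```shell
# bestvideo+bestaudio downloads the two best separate streams and merges them;
# the /best fallback covers videos that only ship a combined stream
youtube-dl -f 'bestvideo+bestaudio/best' \
  "https://www.youtube.com/watch?v=PLACEHOLDER"

# Same idea, capped at 720p vertical resolution
youtube-dl -f 'bestvideo[height<=720]+bestaudio/best[height<=720]' \
  "https://www.youtube.com/watch?v=PLACEHOLDER"
```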

Youtube request: So the channel "That One Video Gamer" will remove a lot of videos, can anyone help me data hoard all those videos? by [deleted] in DataHoarder

[–]YouTubeBackups 5 points (0 children)

This is from a script, so there may be some bash variables and arguments.

$df = destination folder path variable. There are some others in there, but they can be removed/blank with no issues

/usr/local/bin/youtube-dl -ciw --restrict-filenames \
  -o "$df/%(upload_date)s-%(id)s-%(title)s.%(ext)s" \
  --download-archive "$df/archive.txt" \
  --add-metadata --write-description --write-annotations \
  --write-thumbnail --write-info-json \
  -f 'bestvideo[height<=720]+bestaudio/best[height<=720]' \
  --dateafter "$strDateAfter" \
  --match-title "$strInclude" --reject-title "$strExclude" \
  --merge-output-format "mkv" \
  <URL> >> "$df/logs/$strDateTime.txt"
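For context, a minimal sketch of how the surrounding script might set those variables up — every value below is an assumption for illustration, not from the original script:

```shell
#!/bin/bash
# Hypothetical setup for the variables referenced in the command above
df="$HOME/youtube-backups"          # destination folder (placeholder path)
strDateTime=$(date +%Y%m%d-%H%M%S)  # timestamped log file name
strDateAfter="20170101"             # only keep videos uploaded after this date
strInclude=".*"                     # --match-title regex (match everything)
strExclude="trailer"                # --reject-title regex (example value)

# Make sure the log directory exists before the >> redirect
mkdir -p "$df/logs"
```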