Discussion: High-level understanding of PGBlitz using VPS and Cloud

mcleod

Junior Member
Hello Guys,

I am really impressed with the neatness of this project.

New user here - I am versed in Linux/Unix/Usenet, but I am struggling to understand the different functions/blocks that PGBlitz relies on.
I understand that there is a feeder function that takes care of downloading, par-repairing, and unpacking the media. There is also the Plex Media Server (or Emby) that transcodes and serves the media.
What is the function of Google Drive? To host the PMS media? But then I am confused, because PGVault is used to sync GDrive to the PMS, so wouldn't the size be double-counted?
Concerning the feeders (NZBGet and *arrs), I understand that PGClone copies the prepared media files to GDrive - does PGClone take care of deleting the copied files from the feeder box?

Sorry, a lot of questions - I hope to get a better high-level understanding.

So far my idea is to host:
- a GCE Feeder [2 cores/2GB] (no NVMe initially, as I am not sure I need to be able to watch media a few minutes after a download).
- a GDrive (Business, $12/mo) to host the media files
- a Vultr VPS to host PMS and Ombi (2c/2G/60GB - $7/mo)

Just a family setup with 3/4 users watching a few shows a week (in the US).
Thanks!
McLeod
 

MrDoob

Guest
So, first:

2 cores and 2 GB RAM is too low.

And next, GCE as a Plex server is a nightmare.

FYI

Minimum System Specs / Recommended System Specs:
RAM: 4GB minimum / 8GB+ recommended
Storage: 80GB minimum / 500GB+ recommended

Next: G/TDrive is only storage, it doesn't host a PMS.

You need a VPS or dedicated server.

The free credit runs out fast - you only have $300 in free credits, and after that you have to pay too much.

Please watch the videos again and read the wiki before you start with PGBlitz.

The Wiki and the YouTube links are on the left side.
 

captngimpy

Senior Member
Staff
To elaborate a little more: PG Vault is more of a function for backing up and restoring your settings and configurations for all of the programs to your G/TDrive for safekeeping. Google Drive or the Team Drive (I think they just renamed these to Team Shares, don't quote me though) is used for all of your storage. PMS will read the files stored on the G/TDrive and serve them out to users. PG Clone is the transport system between your server (wherever you are downloading to) and the G/TDrive; it is basically what moves your files from local storage to the cloud.

As for the *arrs and NZBGet: the *arrs keep track of your media, and when a new file becomes available they send the request to NZBGet to download it. Once NZBGet has that request, it downloads, unpacks, checks, and repairs if needed. Once it has completed this, it sends a report back to the *arrs telling them to check for the new file. Now, if you set everything up correctly, the *arrs will look for the new file and mark it as downloaded, and if you set up the connection it will trigger Plex to start scanning for new media.
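For anyone who wants a concrete picture of that "transport" step, the move from local storage to the cloud is conceptually just an rclone move - the remote name gdrive: and the paths below are placeholders, not necessarily what PGBlitz generates:

# move finished downloads to Google Drive and clean up behind itself;
# --min-age avoids grabbing files that are still being unpacked
rclone move /mnt/downloads/completed gdrive:Media \
    --min-age 2m \
    --delete-empty-src-dirs \
    --log-level INFO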
 

mcleod

Junior Member
Thanks for this info!

Three questions then:

1. Does PGClone take care of deleting the transferred files from the Feeder (NZBGet/*arrs) server once they are on G/TDrive? Or do I need a cron job or something for that?
2. How does the PMS "mount" the G/TDrive (direct API calls?)?
3. I'll use a VPS for the PMS (Vultr or Ramnode - 2c/2G/2TB bandwidth for $7/mo). The PMS server then needs a lot of bandwidth to download from the G/TDrive and upload to the Plex client, right?

McLeod

 

captngimpy

Senior Member
Staff
1. PG Clone should clear out the files.

2. Mounting is done through mergerfs now, I believe (just call it magic; a rough sketch is below).

3. You will use up a fair amount of bandwidth. If you are using the same server for downloading and then uploading to the cloud, you will use bandwidth to originally download the file, then upload it, and then every time the file is streamed. Also, as @MrDoob said, there are minimum requirements, and 2c/2GB/60GB won't cut it - you might be able to do one stream or two max, especially if the streams are transcoded.
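To put question 2 in concrete terms: the mount itself is done by rclone, with the mergerfs union layered on top, and a hand-rolled equivalent would look roughly like this (the remote name, mount point, and flags are illustrative, not the exact ones PGBlitz writes out for you):

# expose the Google Drive remote as a local folder so Plex can read from it
rclone mount gdrive:Media /mnt/gdrive \
    --allow-other \
    --dir-cache-time 72h \
    --vfs-cache-mode writes \
    --daemon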
 

mcleod

Junior Member
If you are using the same server for downloading and then uploading to the cloud, you will use bandwidth to originally download the file, then upload it, and then every time the file is streamed.
I am thinking of using GCE for the Feeders (download), PGClone to G/TDrive, and mergerfs (thanks - same as unionfs, I guess) to mount it on another VPS running PMS (and Ombi). In this case, the VPS will download from G/TDrive and upload to the Plex client to stream. Is that what most people using the cloud do?
 

captngimpy

Senior Member
Staff
Currently I have a GCE setup that has NZBGet, Sonarr, Radarr, and Lidarr. When you set up PGClone, either Move or Blitz (Blitz is for if you are pushing more than 750GB to the cloud per day, and it gets a little more complicated if you don't know what you are doing), it will set up mergerfs and all your mounts for you, so there is no need to do this manually (a rough sketch of that union is below).
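Purely to illustrate the mergerfs part (the paths here are made up; the real ones come from the PGClone setup), merging the local download folder and the cloud mount into one path is conceptually just:

# union the local downloads and the rclone-mounted Google Drive into one folder;
# with the local branch listed first, new files are created locally and
# reads fall through to the cloud copy
mergerfs /mnt/downloads:/mnt/gdrive /mnt/unionfs \
    -o defaults,allow_other,category.create=ff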

GCE - *arrs, nzbget, and pgclone

VPS- Plex, ombi, and pgclone

That will get you a basic setup. Keep an eye on your GCE - you only have $300 in free credits, and once you burn those you will start getting charged. Typically what I do is run through about $290, then force a backup through PGVault to save my *arrs and NZBGet settings. Then I close the billing account in the billing section of the GCE portal, open a new Gmail account, set everything up again, do a restore, and have everything back up and running for another $300 in credits.
 

Nigel

Senior Member
Staff
Donor
I have the auto-scan setting in Plex enabled so that when a new movie/episode is uploaded it gets scanned. But if I have Plex on my VPS and Sonarr/Radarr on GCE, the auto-scan option doesn't work.
 

mcleod

Junior Member
VPS- Plex, ombi, and pgclone
Sorry, I am still not clear on the Plex VPS side - does it use PGClone (mergerfs) to *download* all the movies from GDrive to the VPS (which would then count toward PGClone's 750GB limit)? Or is it on demand - i.e., when someone starts streaming a movie, it starts the download process from GDrive to the VPS?
 

Cringely

Junior Member
It would be on-demand. rclone mounts the GDrive storage similarly to NFS or CIFS/UNC - that is, it's like connecting an external hard drive, but instead of USB it's just over the internet. The storage will look and act like it's directly mounted to the server, though because it's over IP the latency is poor. So lots of random reads/writes won't work well, but opening streams (continuous reads) will work fine after a second or so.
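If you want to get a feel for that behaviour on an existing mount, a couple of quick checks like these show it (the mount point and file path are just examples):

# directory listings go over the Drive API, so they take a moment
ls /mnt/gdrive/Movies | head

# a sequential read stalls for a second, then streams at full speed
time dd if="/mnt/gdrive/Movies/Example (2019)/Example.mkv" of=/dev/null bs=1M count=256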

Also, for reference, I'm running Plex and PGBlitz on 2 vCPU/2GB RAM at DigitalOcean without any issues.
 

captngimpy

Senior Member
Staff
I have the auto-scan setting in Plex enabled so that when a new movie/episode is uploaded it gets scanned. But if I have Plex on my VPS and Sonarr/Radarr on GCE, the auto-scan option doesn't work.
Don't set it in Plex. Go to the Connect settings in your *arrs and set them to connect to the Plex server - there is a section of the wiki that talks about this. That way the *arrs will force Plex to scan.
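For the curious: what that Connect hook boils down to is the *arr calling Plex's library-refresh API, roughly like the request below (the host, section id, and token are placeholders):

# ask Plex to rescan library section 1 for new files
curl "http://YOUR_PLEX_HOST:32400/library/sections/1/refresh?X-Plex-Token=YOUR_TOKEN"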

Sorry, I am still not clear on the Plex VPS side - does it use PGClone (mergerfs) to *download* all the movies from GDrive to the VPS (which would then count toward PGClone's 750GB limit)? Or is it on demand - i.e., when someone starts streaming a movie, it starts the download process from GDrive to the VPS?
As for the limit of 750GB, that is how much you can upload to GDrive in one day - you can download/stream without limit. PGClone is just a way of mounting G/TDrive so that your server can access the files stored there. Not entirely sure why "clone" was picked as its name - PGMount would be easier to explain - even though the main intention of it is to move the files from local storage to the cloud.
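Side note, since that 750GB/day figure only applies to uploads: if you ever push files up by hand, rclone can be told to stay under the quota - the numbers below are just one common rule of thumb, not something PGBlitz requires:

# ~8 MB/s works out to roughly 690GB per day, safely under the 750GB upload quota;
# --max-transfer stops the run outright before hitting the cap
rclone move /mnt/downloads/completed gdrive:Media --bwlimit 8M --max-transfer 740G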
 

mcleod

Junior Member
As for the limit of 750GB, that is how much you can upload to GDrive in one day - you can download/stream without limit. PGClone is just a way of mounting G/TDrive so that your server can access the files stored there. Not entirely sure why "clone" was picked as its name - PGMount would be easier to explain - even though the main intention of it is to move the files from local storage to the cloud.
Okay! So PGClone has those two features:
1. When installed on the Feeder (GCE), it *replicates*/uploads the files to GDrive and deletes them from the Feeder (NZBGet and *arrs).
2. When installed on the PMS (VPS), it *mounts* GDrive on the PMS (and I get that here it is a download, so there's no 750GB limit - thanks).

Did I get that right?
 

captngimpy

Senior Member
Staff
Okay! So PGClone has those two features:
1. When installed on the Feeder (GCE), it *replicates*/uploads the files to GDrive and deletes them from the Feeder (NZBGet and *arrs).
2. When installed on the PMS (VPS), it *mounts* GDrive on the PMS (and I get that here it is a download, so there's no 750GB limit - thanks).

Did I get that right?
Basically.
 

Nigel

Senior Member
Staff
Donor
Don't set it in Plex. Go to the Connect settings in your *arrs and set them to connect to the Plex server - there is a section of the wiki that talks about this. That way the *arrs will force Plex to scan.
Tried that. But because it has to upload to GDrive first, this doesn't automatically scan my Plex server - because my Plex is on my VPS and Sonarr/Radarr on GCE.
 

captngimpy

Senior Member
Staff
Tried that. But because it has to upload to GDrive first, this doesn't automatically scan my Plex server - because my Plex is on my VPS and Sonarr/Radarr on GCE.
Not sure what the issue is then, because I have the *arrs on GCE and Plex on a separate server and it works perfectly. Whenever the *arrs import a file, they tell Plex to start scanning.
 


Nigel

Senior Member
Staff
Donor
Not sure what the issue is then, because I have the *arrs on GCE and Plex on a separate server and it works perfectly. Whenever the *arrs import a file, they tell Plex to start scanning.
Okay, that is strange. Isn't the problem that Plex will scan but the episode is still uploading to GDrive, so it can't find it yet?
 

captngimpy

Senior Member
Staff
Okay, that is strange. Isn't the problem that Plex will scan but the episode is still uploading to GDrive, so it can't find it yet?
How long is it taking to upload? If you are running a feeder, unless you are uploading huge files it shouldn't take long at all. Once in a while I'll have Plex miss something, but as soon as the next item of the same type downloads it picks it up on the next scan.
 
