PGBlitz.com


NZBGET Downloads - running out of HDD on 2TB. Is this typical?

timmeh

Blitz 2nd Class
I've been up and running my VPS with GSuite for a week or so now without major issues. :cool:
However, I've started running out of local hard drive space. I've got 2TB of usable space (4TB in RAID 5). Downloads are transferring over to GDrive nicely using RClone. Currently there are only 2 small files in /mnt/incomplete/nzbget/ and 3 in /mnt/downloads/nzbget/tv/.

I appreciate that I have a fair few apps installed and running, plus my Plex server, which will have a lot of metadata to deal with as well, so this could be where a lot of storage is being taken up. To try to sort this out, I recently upped my BW Limit to 10MB in the hope of RClone transferring files across faster. I also changed my NZBGET Settings > Download Queue > DiskSpace warning to '0', as I'd read in other posts that it causes unnecessary download pauses. However, my local disk space is still getting completely used up.

I suppose my question is: is 2TB an insufficient amount of local HDD space for constantly downloading loads of data, or is there something else amiss that I need to fix? And if it's the latter, any ideas what it might be?

Timmeh!
 

Proelior

Blitz 3rd Class
Log into your VPS and run:
ncdu /mnt/downloads
This will show you what's using up space in your downloads folder.
If that sheds no light, run:
ncdu /
This will take some time, as it's scanning the whole drive, and your unionfs folder might be quite large.
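If ncdu isn't installed, a stock du one-liner gives a similar (non-interactive) summary. A sketch, with the target directory pulled out into a variable so it's easy to point elsewhere:

```shell
# Non-interactive alternative to ncdu: show the ten largest entries.
# -x keeps du on one filesystem, so mounted cloud storage isn't counted.
TARGET="${TARGET:-/mnt/downloads}"   # path from the advice above
if [ -d "$TARGET" ]; then
  du -xh --max-depth=1 "$TARGET" | sort -rh | head -n 10
fi
```

`sort -rh` understands the human-readable sizes that `du -h` prints, so the biggest directories land at the top.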
 

timmeh

Blitz 2nd Class
Cheers guys. I found that my abandoned downloader had loads of stuff in its incomplete folder - found and deleted. Should have thought of that really... 🥴 Freed up 1TB of space... however...

...since deleting all that last night, and downloading today, the amount of available space went from 1TB down to 500GB.

Using NCDU on the downloads folder, only 150GB is being used by nzbget. So, 350GB has been used by a mysterious source. I'm concerned that if this dynamic continues, I'll be out of space again by tomorrow... any ideas what might be going on?
 

Proelior

Blitz 3rd Class
Did you try the ncdu / command and let it run its course? It will output the space used by every folder on the system.
 

timmeh

Blitz 2nd Class
Thanks Proelior. I'm running that command now, and I've noticed something as it searches: it's listing all the files that I manually added to Google Drive from my Mac laptop, namely a backup of my laptop and other files such as my existing ebooks library. These are stored in /mnt/unionfs/backup/ and /mnt/unionfs/ebooks/ respectively, and they appear to be counted against my VPS's local hard drive space. Is that normal? What do I have to do to have the VPS deal with them the same way it deals with /mnt/unionfs/tv etc., so that they don't count as local hard drive space? Or can I simply not use this GDrive account for anything else?
Thanks for your help.
TIMMEH!
 

PlexFan

Blitz 3rd Class
Staff
So PGBlitz does not download to your computer to take up space. The HD setup is tricky; I go with the cloud to uncomplicate everything. Why it's taking up VPS space is odd.
 

Proelior

Blitz 3rd Class
Your unionfs folder will show all the files you have already uploaded to your gdrive or tdrive. It doesn't necessarily mean they are on your local machine taking up space.
Look in /mnt/downloads, /mnt/move and /mnt/incomplete, and all the subfolders in there. If those folders are empty, then you must have stuff downloaded into the wrong folders. If you do find data you need to upload manually, just create a folder named whatever you want in /mnt/move and make sure to chown -R 1000:1000 the folder you created.
It will then upload on the next cycle.
 

timmeh

Blitz 2nd Class
Hmmm, OK. I'm totally out of space now despite stopping manual uploads. Those folders you mentioned are full of files: /mnt/move/ is taking up 1TB (600 movies, 400 TV). Is it that chown -R 1000:1000 isn't applied properly? How would I check for this? Or is it awaiting an upload cycle? Or perhaps it missed an upload cycle? Thanks for your help @Proelior!
TIMMEH!
 

Proelior

Blitz 3rd Class
Log into your VPS and then issue the following commands:

cd /mnt/move
ls
sudo chown -R 1000:1000 "FOLDER YOU CREATED"

Replace FOLDER YOU CREATED with the name of the new folder (keep the quotes if the name contains spaces).
 

timmeh

Blitz 2nd Class
OK, so I've tried many, many things over the last 2 days and I've wound up losing my /mnt/move data - about 1TB of data. So frustrated with this error. I've reinstalled, I've added commands, I've changed config files, I've changed API keys, I've completely rinsed this forum for advice, but nothing has worked. I'm quickly realising that this setup, unless somehow miraculously executed perfectly, is extremely hard to troubleshoot.
 

Xployt

Blitz Sergeant
Staff
It's not hard to troubleshoot; like everything else, it requires some time and effort.

If you read the Wiki it will help you get a better understanding of how things work.

Are you using PGMove or PGBlitz? If you need extra help and don't want to fill up this thread feel free to PM me and I can assist you further.
 

Datamonkeh

Data Hoarding Primate
Staff
Donor
The advice you've been given tells you how to work out exactly where your data is. It's likely to be failed downloads and logs; it's not difficult to spot based on the size of the directories.
 

timmeh

Blitz 2nd Class
The advice you've been given tells you how to work out exactly where your data is. It's likely to be failed downloads and logs; it's not difficult to spot based on the size of the directories.
Cheers man, I've found the files, and I've found that it's an issue with PGMove. Files simply aren't being deleted from /mnt/move/movies after the download completes. Plex picks them up and they appear in /mnt/unionfs/movies, but if I delete them from /mnt/move/movies to free up space, they are also removed from /mnt/unionfs/movies, which tells me they are still being spoofed and haven't truly transferred. I have rebuilt and followed so many instructions, but nothing seems to work... :( Ongoing saga.
 

timmeh

Blitz 2nd Class
It's not hard to troubleshoot; like everything else, it requires some time and effort.

If you read the Wiki it will help you get a better understanding of how things work.

Are you using PGMove or PGBlitz? If you need extra help and don't want to fill up this thread feel free to PM me and I can assist you further.
Cheers, I'll give you a shout...
 

Datamonkeh

Data Hoarding Primate
Staff
Donor
Then I'd be checking that whichever transport method you are using is deployed properly; if it isn't, you'd get exactly what you describe.
 

Admin9705

Administrator
Project Manager
Cheers, I'll give you a shout...
I learned to have a second VM or box. If the issue is the same, then it's a bug. But unless anyone else reports it, there are more variables. PG9 will have a better system for eliminating files, logging, and chmoding files on the go.
 

DrgnFyre

Blitz 1st Class
Another good couple of commands to know are
ncdu -x /mnt
and
ncdu -x /

The '-x' tells ncdu to stay on one filesystem (it won't cross mount points), so it won't list everything on your G Drive - that can be very helpful for seeing stuff stuck in /mnt/move, etc.

Also, be sure to check out the pgui.yourdomain.com interface - it can highlight if anything is wrong with your setup.

Sometimes re-deploying your PGBlitz transport method can work out some kinks as well - it tears down and rebuilds the whole rclone/pgblitz/move subsystem. I've had a couple of cases where nothing was moving like it was supposed to and then doing that cleared it up.

Also, a good place to look for NZBGet 'space' is /opt/appdata/nzbget - I think by default it keeps logs of everything it's ever done, so its history/logging folder quickly gets out of hand.
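A sketch of checking and pruning that history by hand (the /opt/appdata/nzbget path is from the post above; the 7-day retention and the '*.log' pattern are example assumptions, and NZBGet's own log-rotation settings are the cleaner long-term fix):

```shell
APPDATA="${APPDATA:-/opt/appdata/nzbget}"   # path from the post above
if [ -d "$APPDATA" ]; then
  # How big has the appdata directory grown?
  du -sh "$APPDATA"
  # Prune log files older than 7 days (retention period is an example).
  find "$APPDATA" -name '*.log' -mtime +7 -delete
fi
```

Run the `du` line first on its own if you just want the number before deleting anything.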
 

timmeh

Blitz 2nd Class
Very basic question, but I've gone back to basics to try to sort this out. Can I just double-check that the 750GB daily upload limit still stands? And that if I download more than 750GB worth of stuff, it would start clogging up internal storage? I'd read that RClone uses service accounts to bypass this, but read elsewhere that the 750GB limit is enforced... this may be a simple answer to my problems, as I'm re-downloading my entire library at the moment at a rate of more than 750GB per day.
 

Proelior

Blitz 3rd Class
Depends on which mount you installed.
PGMove does have a 750GB limit, but upload speeds are capped to ensure you don't go over it.
PGBlitz, on the other hand, has no such restriction.
If you need to upload more than 750GB per day, you must deploy PGBlitz.
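For the curious, the speed cap falls out of simple arithmetic: 750GB spread evenly over 24 hours pins the sustained rate. A quick sanity check (treating GB as 10^9 bytes, which is an assumption about how the quota is counted):

```shell
# 750 GB spread evenly over a 24-hour day: the fastest sustained upload
# rate that still stays under the daily quota.
awk 'BEGIN { printf "%.2f MB/s\n", 750e9 / 86400 / 1e6 }'   # prints "8.68 MB/s"
```

Which lines up with why upload-capped setups throttle to the 8-9MB/s range (rclone's --bwlimit flag is the usual knob if you manage uploads yourself).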
 
