SpamAssassin creating bayes.toks.expire text files - cPanel

I have a shared hosting account at HostGator and have been using spamassassin for several months with no problem. About 10 days ago, I logged in to cPanel > File Manager > .spamassassin folder, and there were 10-12 text files created like these:
bayes.toks.expire40422 2.99 MB
bayes.toks.expire42356 5.07 MB
bayes.toks.expire44593 5.07 MB
On average, about 2 new files like these are being created each day. I can't open them in File Manager because the files are too large, and if I download one to my PC and open it in Notepad, there is a lot of content, but it is unreadable.
So far, there are also 3 other odd files, created about every 3-4 days, like the following:
bayes.lock.gator.hostgator.com.15247 180 bytes
bayes.lock.gator.hostgator.com.28605 210 bytes
bayes.lock.gator.hostgator.com.78666 180 bytes
I searched Google, and most posts are pretty old (2006-2009); none seem to have a clear answer other than stating that these files can be manually deleted. Naturally, I don't want to log in every week to manually delete these files, so I am trying to find out the cause and a resolution.
I submitted a support ticket to HostGator and their only reply was: 'This is caused due to spamassassin configuration', which does not help.
Also in the .spamassassin folder are these 3 related files:
bayes_journal 28 KB
bayes_seen 324 KB
bayes_toks 5.07 MB
I have the user_prefs file configured and working. Does anyone know the cause of these files or how to prevent them in a shared hosting environment where I do not have direct access to the server?
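One commonly suggested workaround, hedged because the exact cause depends on HostGator's SpamAssassin setup: the bayes.toks.expire<pid> files look like the temporary token databases SpamAssassin writes during a Bayes expiry run, left behind when the run is killed partway through (for example by a shared-hosting resource limit). If your host honors bayes_auto_expire in user_prefs and gives you cron jobs, you can turn off the opportunistic expiry and run the expiry and cleanup on your own schedule instead:

# in ~/.spamassassin/user_prefs: don't run token expiry during mail scans
bayes_auto_expire 0

# cPanel cron jobs (weekly): expire tokens deliberately, then remove any
# leftover temporary expiry files
0 3 * * 0 /usr/bin/sa-learn --force-expire
30 3 * * 0 find ~/.spamassassin -name 'bayes.toks.expire*' -delete

The path to sa-learn is an assumption; running 'which sa-learn' over SSH, or a support ticket, would confirm it.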

Related

When to use the Registry

One of my VB.NET applications started off fairly simple and (surprise) has grown significantly. I was using the Registry to save my limited number of settings, but that has now grown to the point where I felt I was abusing the Registry. I have converted most of my saved settings to XML files, which have been working well. I have been looking through threads on this and still am not sure whether to use files or the Registry, so I would appreciate thoughts on where to save the following (a sketch of the file-based approach follows the list):
Paths to the user settings files. Currently the application looks in a
specific sub-folder of the user's Documents folder for an XML file
containing the paths.
Window positions and sizes for forms (over 50 forms).
Licensing data (licensing per machine).
Application version.
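A minimal sketch of the file-based approach for the window positions, assuming a per-user file under AppData; the WindowState class and the folder and file names are illustrative, not from the original application:

Imports System.Collections.Generic
Imports System.IO
Imports System.Xml.Serialization

Public Class WindowState
    Public Property FormName As String
    Public Property X As Integer
    Public Property Y As Integer
    Public Property Width As Integer
    Public Property Height As Integer
End Class

Module SettingsStore
    ' Per-user, per-application location; no Registry involved.
    Private ReadOnly SettingsPath As String =
        Path.Combine(Environment.GetFolderPath(Environment.SpecialFolder.ApplicationData),
                     "MyApp", "windows.xml")

    Sub Save(states As List(Of WindowState))
        Directory.CreateDirectory(Path.GetDirectoryName(SettingsPath))
        Dim serializer As New XmlSerializer(GetType(List(Of WindowState)))
        Using stream = File.Create(SettingsPath)
            serializer.Serialize(stream, states)
        End Using
    End Sub

    Function Load() As List(Of WindowState)
        If Not File.Exists(SettingsPath) Then Return New List(Of WindowState)()
        Dim serializer As New XmlSerializer(GetType(List(Of WindowState)))
        Using stream = File.OpenRead(SettingsPath)
            Return CType(serializer.Deserialize(stream), List(Of WindowState))
        End Using
    End Function
End Module

Of the four items, per-machine licensing data is the one where a per-user file defeats the intent; HKEY_LOCAL_MACHINE or a file under CommonApplicationData is the conventional per-machine location.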

Download large file from FTP server in chunks

I need to download a large file from an FTP server. A new file is uploaded once a week, and I need to be the first to download it. I've written a check that detects when the file has been uploaded and, if it is there, starts the download. The problem is that this is a big file (3 GB). I can download about 10% of the file within the first few minutes, but as more and more people discover that the file is up, the average download speed drops and drops, to the point where the remaining 80-90% takes about 3-4 hours.
The time isn't a huge problem, but it sure would be nice if I could get the download done quicker. The real problem is that my download never finishes, and I think it's because the connection gets timed out.
One solution would be to extend the download timeout, but I have another suggestion: download the file in chunks. Right now I'm downloading from the beginning to the end in one go. It starts off at a good download speed, but as more and more people begin their downloads, it slows all of us down. I would like to split the download into smaller chunks and have all the separate downloads start at the same time. As an illustration: with 8 starting points, I'll end up with 8 parts of the zip file, which I then need to recombine into one file once the download has ended. Is this even possible, and how would I approach this solution? If I could do this, I would be able to complete the entire download in about 10-15 minutes instead of waiting the extra 3-4 hours for the download to fail and then having to restart it.
Currently I use a WebClient to download the FTP file, since all the other approaches I tried couldn't finish the download because the file is larger than 2.4 GB.
' Current approach: one async download of the whole file
' (user, password and ip are placeholders).
Private wc As New WebClient()
wc.DownloadFileAsync(New Uri("ftp://user:password@ip/FOLDER/" & FILENAME), downloadPath & FILENAME)
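Yes, this is possible as long as the server supports resuming (the FTP REST command), which FtpWebRequest exposes as ContentOffset. A minimal sketch, assuming 8 parts; the host, credentials and file names are placeholders:

Imports System.IO
Imports System.Linq
Imports System.Net
Imports System.Threading.Tasks

Module ChunkedFtpDownload
    ' Download one slice of the remote file (length bytes starting at offset)
    ' into its own part file.
    Sub DownloadChunk(fileUri As String, creds As NetworkCredential,
                      offset As Long, length As Long, partPath As String)
        Dim request = CType(WebRequest.Create(fileUri), FtpWebRequest)
        request.Method = WebRequestMethods.Ftp.DownloadFile
        request.Credentials = creds
        request.ContentOffset = offset ' sends REST <offset> before RETR

        Using response = request.GetResponse()
            Using ftpStream = response.GetResponseStream(),
                  partFile = File.Create(partPath)
                Dim buffer(81919) As Byte
                Dim remaining As Long = length
                While remaining > 0
                    Dim read = ftpStream.Read(buffer, 0, CInt(Math.Min(buffer.Length, remaining)))
                    If read = 0 Then Exit While ' connection closed early
                    partFile.Write(buffer, 0, read)
                    remaining -= read
                End While
            End Using
        End Using
    End Sub

    Sub Main()
        Const fileUri As String = "ftp://ip/FOLDER/file.zip" ' placeholder
        Dim creds As New NetworkCredential("user", "password")
        Const parts As Integer = 8

        ' Ask the server for the total file size first.
        Dim sizeRequest = CType(WebRequest.Create(fileUri), FtpWebRequest)
        sizeRequest.Method = WebRequestMethods.Ftp.GetFileSize
        sizeRequest.Credentials = creds
        Dim totalSize As Long
        Using sizeResponse = sizeRequest.GetResponse()
            totalSize = sizeResponse.ContentLength
        End Using

        ' Start all 8 chunk downloads at the same time.
        Dim chunkSize As Long = (totalSize + parts - 1) \ parts
        Dim tasks = Enumerable.Range(0, parts).Select(
            Function(i) Task.Run(
                Sub()
                    Dim offset As Long = CLng(i) * chunkSize
                    DownloadChunk(fileUri, creds, offset,
                                  Math.Min(chunkSize, totalSize - offset),
                                  "part" & i & ".bin")
                End Sub)).ToArray()
        Task.WaitAll(tasks)

        ' Recombine the parts, in order, into the final file.
        Using output = File.Create("file.zip")
            For i As Integer = 0 To parts - 1
                Using part = File.OpenRead("part" & i & ".bin")
                    part.CopyTo(output)
                End Using
            Next
        End Using
    End Sub
End Module

Note that each chunk opens its own FTP control and data connection, so this only helps if the server allows several simultaneous connections per client.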

Realm Objective-C - really huge DB file size - 64 GB

We recently planned to switch from SQLite to Realm in our macOS and iOS apps due to a database file corruption issue with SQLite, so we started with the macOS app. All the code changes were smooth, and the app started working fine.
Background on the app and its DB usage: the app uses the database very heavily and performs a great many reads and writes each minute, saving big XMLs to it. Each minute it writes/updates around 10-12 records (at most) with XML and reads 25-30 records too. After each read it deletes the data, along with the XML, from the database. My expectation was that once data is deleted it would free up space and reduce the file size, but the file looks like it is growing continuously.
To test the new DB changes we kept the app running for 3-4 days, and the DB file size went to 64.42 GB while the app started getting slow.
To debug further, I started the app with a new DB file; its size was 4 KB, but within 5 minutes it grew to 295 KB and never shrank, even as records were continuously added and deleted.
To clarify further, the app uses NSThreads to perform various operations, and those threads read and write data to the DB, with proper begin/commit transactions. I also read the 'Large File Size' section at https://realm.io/docs/java/latest/#faq, which says Realm reuses free space inside the file rather than returning it to the filesystem, and tried to find compactRealm, but I can't find it in Objective-C.
Can anybody please advise?
Update - I gave up on Realm
After 15 days of effort, I have finally stopped using Realm and am going back to fixing/working around the DB file corruption issue with SQLite. The huge Realm DB file issue was fixed by changing the threading code, but then I started getting a 'Too many open files' error after running the app for 7-8 hours.
I debugged for a whole week and made every change I could think of, and at some point everything looked good, as Xcode was not showing any open files. But the 'Too many open files' crash came back, and when I debugged with Instruments I found a great many open file handles to the Realm database and to its lock, commit and cv files.
I am sure there are no leaks in the app, and Xcode also does not show those open files under Disk usage. I decided to invoke the lsof command in code before and after Realm calls; most of the time the open file count doesn't increase, but sometimes in between it does. In my app it went from 120 open files to 550 in around 6 hours. Xcode looks all fine via Disk usage, but Instruments shows the open files.
There was no good support from the Realm team; I emailed them and got just one response. I made many changes to the code following their suggestions, and none of it worked, so I gave up on it. I think it's good for small apps only.

Amazon S3 Different Download Speeds for File Sizes

I want to know whether my theory is true. I have the following files hosted on S3:
Single 83.9 MB ZIP File
The Single ZIP File separated into 12 files
The Single ZIP File separated into 24 files
I was assuming the single ZIP file would have the best results but this doesn't appear to be the case.
Latest Result
Single: 31 minutes
12 Files: 2.8 minutes
24 Files: 6 minutes
The single-file download in particular varies in speed; I've had results ranging from 15 minutes to 35 minutes for this file.
Question: Does Amazon S3 have different download methods/speeds for different file sizes?
No, Amazon S3 does not use different download methods or speeds for different file sizes or counts. It would help if you let us know which tool you are using to download your data, because if your tool can download in parallel threads (processes), the multi-part downloads will take less time than downloading the single file.
Second, the download time may vary because of your internet connection speed too.
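That parallelism can be reproduced even with the single ZIP, because S3 honors HTTP Range requests. A minimal VB.NET sketch, assuming the object is publicly readable over HTTPS; the URL and part count are placeholders:

Imports System.IO
Imports System.Linq
Imports System.Net
Imports System.Threading.Tasks

Module RangedS3Download
    ' Download the inclusive byte range [fromByte, toByte] into partPath.
    Sub DownloadRange(url As String, fromByte As Long, toByte As Long, partPath As String)
        Dim request = CType(WebRequest.Create(url), HttpWebRequest)
        request.AddRange(fromByte, toByte) ' S3 honors the HTTP Range header
        Using response = request.GetResponse(),
              body = response.GetResponseStream(),
              part = File.Create(partPath)
            body.CopyTo(part)
        End Using
    End Sub

    Sub Main()
        Const url As String = "https://my-bucket.s3.amazonaws.com/archive.zip" ' placeholder
        Const parts As Integer = 12

        ' HttpWebRequest allows only 2 concurrent connections per host by
        ' default, which would serialize most of the parts.
        ServicePointManager.DefaultConnectionLimit = parts

        ' HEAD request to learn the object's size.
        Dim head = CType(WebRequest.Create(url), HttpWebRequest)
        head.Method = "HEAD"
        Dim totalSize As Long
        Using headResponse = head.GetResponse()
            totalSize = headResponse.ContentLength
        End Using

        ' Fetch all ranges in parallel, then concatenate them in order.
        Dim chunk As Long = (totalSize + parts - 1) \ parts
        Dim tasks = Enumerable.Range(0, parts).Select(
            Function(i) Task.Run(
                Sub()
                    Dim first As Long = CLng(i) * chunk
                    Dim last As Long = Math.Min(first + chunk - 1, totalSize - 1)
                    DownloadRange(url, first, last, "part" & i & ".bin")
                End Sub)).ToArray()
        Task.WaitAll(tasks)

        Using output = File.Create("archive.zip")
            For i As Integer = 0 To parts - 1
                Using part = File.OpenRead("part" & i & ".bin")
                    part.CopyTo(output)
                End Using
            Next
        End Using
    End Sub
End Module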

ASP upload files to server and limit size

I am looking for a free ASP script that will allow me to upload files to my server while limiting the size and type of the uploaded files. It should also inform the user of any errors rather than throwing them to an IIS error page because of IIS size limits.
I'd really appreciate it if there were an addition that checks the size limit before the file is actually uploaded (meaning, at the browser).
Is there anything like this?
Thanks
Tal
I found this code by a guy named Lewis Moten several years ago:
http://www.planet-source-code.com/vb/scripts/ShowCode.asp?txtCodeId=8525&lngWId=4
It includes checking of file size, though it does happen server-side, not client-side.
I used this code for a project a few years ago and it worked great.
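On the server-side check, a minimal classic ASP (VBScript) sketch of the usual guard: reject an oversized request before parsing the multipart body. The 200 KB limit is an example, and IIS's own request-size limit (AspMaxRequestEntityAllowed on IIS 6 and later) must be set at least that high, or IIS will still cut in before the script runs:

<%
' Reject oversized uploads before doing any parsing work (example limit: 200 KB).
Const MAX_UPLOAD_BYTES = 204800

If Request.TotalBytes > MAX_UPLOAD_BYTES Then
    Response.Status = "413 Request Entity Too Large"
    Response.Write "Sorry, the file is too large. The limit is 200 KB."
    Response.End
End If

' ...otherwise hand the request to the upload parser (e.g. the Lewis Moten script)...
%>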