Mozilla Thunderbird: recovering emails from global-messages-db.sqlite

How can I recover lost emails from Thunderbird's global-messages-db.sqlite?
My PC kernel panicked in the middle of migrating some emails from IMAP to local storage. As a result, the migration never finished writing to disk.
Due to the failed migration, many emails exist in global-messages-db.sqlite but do not exist in the corresponding mbox files (e.g., not in Mail/username#hostname.tld). I verified that the emails exist in global-messages-db.sqlite via sqlitebrowser. In fact, I've successfully extracted the content of some critical messages by hand.
Upon startup, Thunderbird culls from global-messages-db.sqlite any emails it does not find in the mbox files. If I replace the active global-messages-db.sqlite with a version containing my lost emails, Thunderbird ignores and eventually removes those lost emails.
This would be a trivial dilemma if I still had access to the IMAP account. Unfortunately, I do not.

Scripting a converter is trivial
I would quickly code a gloda->mbox converter; that's probably your fastest path to success. You've already discovered the structure of the gloda database. Now it's just a matter of writing a little JavaScript in node.js that opens an sqlite database, iterates over the folders, then over the messages in each folder, and writes an mbox file per folder. The mbox file format is trivial (just watch out for body lines beginning with "From ", which must be escaped). Once you have that, you can open these mbox files in Thunderbird by simply overwriting an existing empty Local Folder.
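To make that concrete, here is a minimal sketch of such a converter using the better-sqlite3 package. The gloda table and column names it queries (folderLocations, messages, messagesText) and the microsecond date unit are assumptions based on typical gloda schemas; verify them against what you see in sqlitebrowser before running it.

// Minimal gloda -> mbox converter sketch (Node.js, better-sqlite3).
// Table/column names and the microsecond date unit are assumptions --
// check them against your own database in sqlitebrowser first.
const Database = require('better-sqlite3');
const fs = require('fs');
const path = require('path');

const db = new Database('global-messages-db.sqlite', { readonly: true });
fs.mkdirSync('recovered', { recursive: true });

const folders = db.prepare('SELECT id, folderURI FROM folderLocations').all();
const msgsIn = db.prepare('SELECT id, date, headerMessageID FROM messages WHERE folderID = ?');
const textOf = db.prepare('SELECT subject, body, author, recipients FROM messagesText WHERE docid = ?');

for (const folder of folders) {
  // Derive a file name from the folder URI, e.g. ".../INBOX" -> "INBOX".
  const name = (folder.folderURI.split('/').pop() || 'unnamed').replace(/[^\w.-]/g, '_');
  const lines = [];
  for (const msg of msgsIn.all(folder.id)) {
    const t = textOf.get(msg.id);
    if (!t || t.body == null) continue; // not every row has indexed text
    const date = new Date(msg.date / 1000); // gloda stores microseconds (assumption)
    lines.push('From - ' + date.toUTCString()); // mbox message separator
    lines.push('From: ' + (t.author || 'unknown'));
    lines.push('To: ' + (t.recipients || 'unknown'));
    lines.push('Subject: ' + (t.subject || ''));
    lines.push('Message-ID: <' + (msg.headerMessageID || '') + '>');
    lines.push('Date: ' + date.toUTCString());
    lines.push('');
    // mbox requires escaping body lines that begin with "From ".
    for (const l of t.body.split('\n')) {
      lines.push(l.startsWith('From ') ? '>' + l : l);
    }
    lines.push('');
  }
  if (lines.length) {
    fs.writeFileSync(path.join('recovered', name), lines.join('\n') + '\n');
  }
}

With Thunderbird closed, each recovered file can then be dropped over an empty folder's mbox file under Local Folders, as described above.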
P.S. Just as a reminder: make backups. Especially before migration operations like this, but also regularly.

Related

Is saving an Outlook email attachment to disk, to enable reading of the file content by strings.exe or exiftool.exe, safe?

I am using strings.exe (https://learn.microsoft.com/en-us/sysinternals/downloads/strings) and exiftool.exe to try to establish whether Outlook attachments really are the file type they are listed as, and haven't just had their extension changed.
The problem is that both of these tools expect a file path, which would force me to call .SaveAsFile(string path) on the mail item's attachment object and save the attachments to disk in order to scan them before I delete them.
Is this a security risk?
As far as I know I am not executing the file, merely reading its metadata, so any malicious file should not be executed, right?
I am using this wrapper to read the file from C#: https://github.com/AerisG222/NExifTool
I have Bitdefender installed on the machine, and when I tried to save a test virus file, it immediately picked the file up and deleted it.
I have had a look at Redemption, and it has an option to return the attachment AsStream, which would let me work in memory; but exiftool.exe and strings.exe both expect a file path.
Hoping to get an answer before the SO police arrive.
It will not be a security risk; the file is never executed.
More than that, the Outlook Object Model (OOM) blocks certain attachment types (such as exe): they are not even accessible in the MailItem.Attachments collection (unless you are using Redemption, of course).

Is it bad practice to have support ticket software create a folder for each submitted ticket?

I wrote a support ticket system. Part of the ticket submission process allows the user to attach files. Once files are attached, the software creates a folder with the ticket ID number as the folder name, and stores the attachments there.
Each time the ticket is loaded, the contents of that folder are displayed in a listbox, where they can be opened, modified, added to, or deleted.
The reason I did this was so that the attachment folders were easily accessible even if the user was not using the software. This way they could send someone who doesn't have access to the software a link to the attachment folder.
The support ticket system is stored in an Access database and runs from a stand-alone Windows Forms application.
Is this bad practice? Will it eventually bog the system down? The software will be run off of a share drive, and the folders will be stored there as well.
I do not anticipate more than 60 users ever using this software.
I do not anticipate more than a few dozen requests per week, and requests could be archived once they were resolved.

VbaProject.OTM deployment

I came by this page and was thinking about the best method to distribute my VbaProject.OTM file (located in %appdata%\Microsoft\Outlook\) to a bunch of ~30 users at my office. Is it better to simply copy/paste the OTM file onto the network and then copy/paste it back to all users' computers (manually or with a .bat), OR would it be better to use the method described in the link above to generate an OPS file and import it back with Proflwiz.exe? What's the difference?
We are all on Microsoft Office Outlook 2003 at the moment; we might upgrade to 2007 one day, but that is still years away.
I finally came up with some elements for deploying an Outlook VBA project. There are a lot of ways to do this, but the easiest way, without installing anything and keeping the same methodology, is to run the OTM file directly from a server. I found out that outlook.exe has an /altvba parameter that allows you to specify another path to load the OTM file from. Here is an example:
outlook.exe /altvba "\\myServer\myFolder\myFile.otm"
This allows me to update only one file to get all computers updated. Obviously, if the file is big and the server's ping is on the high side, it may delay the launch of Outlook. The other problem with this method is that everybody has to shut down Outlook before you can update the OTM file on the server (and if you work in an office where everyone uses Outlook, you know it is impossible to get everyone to shut it down at the same time, unless you eventually code a macro to do exactly that).

To prevent both problems, I could set up a batch file that copies the server's OTM file client-side whenever there is a new version (I just have to check the NTFS last-modified attribute), as sketched below. That way Outlook boots from a local file, the batch file takes 2-3 seconds to copy the file when needed (or launches Outlook instantly otherwise), and updating the OTM file on the server is no longer a problem. Users do have to start Outlook through the batch file (or through a shortcut that passes the /altvba parameter, so either way they need a different shortcut/file the first time). One other advantage of /altvba is that it remains easy for the user to run Outlook without it (to check whether the VBA project is what makes Outlook sluggish), and the file remains untouched by an Outlook reinitialization.
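A minimal launcher sketch along those lines (the share path is the example from above; the Office 2003 install path is an assumption, so adjust it to your machines). xcopy's /D switch copies the file only when the server copy is newer than the local one, which is exactly the last-modified check described:

@echo off
rem Copy the OTM client-side only if the server copy is newer (xcopy /D
rem compares file times); /Y suppresses the overwrite prompt.
xcopy "\\myServer\myFolder\myFile.otm" "%APPDATA%\Microsoft\Outlook\" /D /Y
rem Start Outlook against the local copy (Office 2003 path shown; adjust).
start "" "C:\Program Files\Microsoft Office\OFFICE11\OUTLOOK.EXE" /altvba "%APPDATA%\Microsoft\Outlook\myFile.otm"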
Other solutions include a COM add-in, which can be developed in a lot of languages, including VB6 (no conversion needed from VBA). There is also a bunch of tools included in Microsoft Office XP Developer that could help get the job done (not free, however, especially if you need the most up-to-date version).

How can I determine if files in a "drop folder" are completely transferred

Remote clients will upload images (and perhaps some instructional files in specially formatted text) to a "drop folder." Once an upload is complete, we need to begin processing the images. It would be an easy, but flawed, solution to just have a script automatically begin processing any files in the folder every few seconds (the files can be moved out of the folder once processed); but problems would arise when attempting to process large images that are only partially transferred.
What are some tricks I can use to ensure the files are fully uploaded before processing them?
A few of my own thoughts:
The script can check the validity of the file; i.e., a partial JPEG would produce an error, and the script could respond to that error, though this would be fairly CPU-intensive. Some formats have special markers at the end of the file, but I can't count on that; I'm not sure which formats I'll be dealing with.
I've heard of "file handles" but haven't really figured out what they are or how to tell whether there is a handle on a particular file. Basically, the FTP daemon (actually, I'm on Windows, so "service") would keep a handle on the file while it's being uploaded, and you would know not to process that file. These are just a few of my thoughts, but I'm not really sure whether they will work or whether there are better or more accepted ways of solving this problem.
If you have a server-side upload script (PHP, ASP, JSP, whatever), you could instruct the script to call another script to process the files, or to create a flag file indicating that the upload is done, something like this:
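A minimal sketch of the flag-file approach in Node.js (the drop-folder path and the ".done" naming convention are assumptions for illustration): the uploader creates photo.jpg.done only after photo.jpg has been fully written, and the processor only picks up files whose flag exists.

// Process only files whose ".done" flag exists.
const fs = require('fs');
const path = require('path');

const DROP = '/srv/drop'; // example path

function processImage(file) {
  console.log('processing', file); // stand-in for the real processing step
}

for (const entry of fs.readdirSync(DROP)) {
  if (!entry.endsWith('.done')) continue;               // look at flags only
  const dataFile = path.join(DROP, entry.slice(0, -5)); // strip ".done"
  if (!fs.existsSync(dataFile)) continue;               // stale flag, skip
  processImage(dataFile);
  fs.unlinkSync(dataFile);                              // or move it out instead
  fs.unlinkSync(path.join(DROP, entry));
}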
If your server is Linux-based, you can use lsof to check whether the file is open. Since your FTP daemon/script/CGI will close the file once the upload completes, lsof will no longer list it.
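For example (the path here is just an illustration):

lsof /srv/drop/incoming/photo.jpg

If lsof prints nothing (and exits non-zero), no process currently holds the file open, so the transfer has finished.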
If your server is Windows-based, you can use Process Explorer to list the open files.
By what method are your users uploading the images?

How do I force a file to be deleted? Windows Server 2008

On my site a user may upload a file (pic, zip, audio, video, whatever). He may then decide to replace it with a newer revision. The user may upload a file, make a post, then decide to put up a new revision replacing the old one (let's say it's a large zip or tar.gz file). There's a good chance people may be downloading it if he sent out an email, or even an IM for the home user.
The problem: I need to replace the file while people may be downloading it, and it may be some minutes before it can be deleted. I don't want my code to stall until it can delete the file, nor to check every second to see whether it's unused (especially bad if another user can start a new download in the meantime, creating a cycle).
How do I delete the file while users are downloading it? I don't care if their downloads stop; I just care that the file can be replaced and that new downloads get the new revision.
What about referencing the files indirectly?
A mapping script maps a virtual file entry on your site to a real file. If the user wants to upload a new revision of his file, you just update the mapping, not the real file.
You can install a daily task that scans all files and deletes any file that has no mapping and no open connections.
lajuette's answer is right; the easiest solution is to work around the file locking altogether:
When a user uploads file foo.zip, internally store it as foo-v1.zip.
Create a mapping file somewhere (database, code, whatever) that maps foo.zip to foo-v1.zip.
Rather than exposing a direct link to the file, expose a link to a service that gets the file: mysite.com/Download?foo.zip or something. This service uses the mapping to determine which version of the file to send to the client.
When a new version is uploaded, create foo-v2.zip and update the mapping file.
It wouldn't be that hard to write a scheduled task that cleans up old, un-mapped files.
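A minimal sketch of such a download service in Node.js (the in-memory mapping, the port, and the files directory are assumptions for illustration; in practice the mapping would live in your database):

// Clients request /Download?foo.zip; the service streams whatever
// revision the mapping currently points to. Downloads already in
// flight keep their handle on the old revision, so flipping the
// mapping never interrupts them.
const http = require('http');
const fs = require('fs');
const path = require('path');

let mapping = { 'foo.zip': 'foo-v1.zip' }; // updated on each new upload

http.createServer((req, res) => {
  const name = decodeURIComponent(req.url.split('?')[1] || '');
  const real = mapping[name]; // only mapped names are ever served
  if (!real) {
    res.writeHead(404);
    return res.end('not found');
  }
  res.writeHead(200, {
    'Content-Type': 'application/octet-stream',
    'Content-Disposition': 'attachment; filename="' + name + '"',
  });
  fs.createReadStream(path.join('files', real)).pipe(res);
}).listen(8080);

When a new revision arrives, save it as foo-v2.zip and flip the mapping; the cleanup task can delete foo-v1.zip later, once nothing holds it open.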
If you're opposed to a database, and if the filenames are in a fixed format (such as user/id.ext), you could append a revision number to the id, enumerate the folder using a pattern (user/id-*), and use the latest revision.
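A sketch of that pattern (the id-revision.ext naming scheme is an assumption for illustration):

// Pick the latest revision of <id>-<rev>.<ext> in a user's folder,
// no database needed.
const fs = require('fs');

function latestRevision(dir, id, ext) {
  const re = new RegExp('^' + id + '-(\\d+)\\.' + ext + '$');
  let best = null;
  let bestRev = -1;
  for (const f of fs.readdirSync(dir)) {
    const m = f.match(re);
    if (m && Number(m[1]) > bestRev) {
      bestRev = Number(m[1]);
      best = f;
    }
  }
  return best; // e.g. "42-3.zip" for latestRevision('user', '42', 'zip')
}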