Compressing the final output data into gzip format - gzip

I am getting the final output data in C++ as a string. I need to compress that data into gzip format. Can someone tell me how to implement this?

Use zlib. It's probably already available in your development environment. (Which for some reason you are keeping a secret in your question.)
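For what it's worth, here is a minimal sketch of how that can look with zlib's deflate API; the function name and buffer size are just illustrative, and the key detail is passing windowBits = 15 + 16 to deflateInit2() so that zlib emits a gzip header and trailer instead of its default zlib wrapper:

#include <stdexcept>
#include <string>
#include <zlib.h>

// Compress an in-memory string into a gzip stream using zlib's deflate API.
// windowBits = 15 + 16 requests a gzip header/trailer instead of the zlib wrapper.
std::string gzip_compress(const std::string &input)
{
    z_stream strm = {};
    if (deflateInit2(&strm, Z_DEFAULT_COMPRESSION, Z_DEFLATED,
                     15 + 16, 8, Z_DEFAULT_STRATEGY) != Z_OK)
        throw std::runtime_error("deflateInit2 failed");

    strm.next_in  = reinterpret_cast<Bytef *>(const_cast<char *>(input.data()));
    strm.avail_in = static_cast<uInt>(input.size());

    std::string output;
    char buf[16384];
    int ret;
    do {
        strm.next_out  = reinterpret_cast<Bytef *>(buf);
        strm.avail_out = sizeof(buf);
        ret = deflate(&strm, Z_FINISH);   // all input is already available
        output.append(buf, sizeof(buf) - strm.avail_out);
    } while (ret == Z_OK);

    deflateEnd(&strm);
    if (ret != Z_STREAM_END)
        throw std::runtime_error("deflate did not finish cleanly");
    return output;
}

Link with -lz. For inputs larger than what fits in a uInt you would feed the data in chunks instead of handing it all to avail_in at once.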


How to extract the encoding dictionary from gzip archives

I am looking for a method whereby I can extract the encoding dictionary made by the DEFLATE algorithm from a gzip archive.
I need the LZ77 pointers from the whole archive, which refer back to earlier patterns in the file, as well as the Huffman tree used with those pointers.
Is there any solution in Python?
Does anyone know whether https://github.com/madler/infgen/blob/master/infgen.c might provide the dictionary?
The "dictionary" used for compression at any point in the input is nothing more than the 32K bytes of uncompressed data that precede that point.
Yes, infgen will disassemble a deflate stream, showing all of the LZ77 references and the derived Huffman codes in a readable form. You could run infgen from Python and interpret the output in Python.
infgen also has a -b option for a non-human-readable binary format that might be faster to process for what you want to do.
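For reference, a typical invocation (assuming you have built infgen from the linked source; the file names here are illustrative) just pipes the gzip file through it, optionally with -b for the binary output mentioned above:

infgen < archive.gz > archive.dis

You could launch that from Python with subprocess and parse archive.dis, or process the -b output if that turns out to be faster for your purposes.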

How to inject data in a .bin file in a post compilation script?

Purpose
I want my build system to produce one binary file that includes:
The bootloader
The application binary
The application header (for the bootloader)
Here's a small overview of the memory layout (nothing out of the ordinary here)
The build system already concatenates the bootloader and the application in a post-compilation script.
In other words, only the header is missing.
Problem
What's the best way to generate and inject the application header into the memory image?
Possible solutions
Create a .bin file just for the header and use cat to inject it into my final binary
Use the linker file to hardcode the header (is this possible?)
Use a script to read the final binary and hardcode the header
Other?
What is the best solution for injecting data into the image in a post-compilation script?
SRecord is a great tool for doing all kinds of manipulation on binary and other file types used for embedded code images.
In this case, given a binary bootheader.bin to insert at offset 0x8000 in image.bin:
srec_cat bootheader.bin -binary -offset 0x8000 -o image.bin
The tool is somewhat arcane, but the documentation includes numerous examples covering various common tasks.
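If you want srec_cat to do the whole concatenation as well, a sketch along these lines should work; the file names and offsets below are made up, and the trailing -binary after -o asks for a raw binary output file instead of the default Motorola S-record format:

srec_cat bootloader.bin -binary \
         header.bin -binary -offset 0x7000 \
         application.bin -binary -offset 0x8000 \
         -o image.bin -binary

If the three regions leave gaps, you may also want a -fill filter so the output is padded rather than containing holes.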

Please tell me about the use and workings of cppcheck after including header files for analysis

Please tell me the differences between running cppcheck's analysis with and without header files.
I am integrating cppcheck's report with Sonar; will Sonar's dashboard show any differences?
After including header files, the analysis took about 5 days to complete, even though I used the -j 4 and --max-configs=2 options.
I am also confused that the LOC count went down after including header files for the analysis, and the reported numbers of functions and classes dropped as well.
Does cppcheck report errors on header files? If yes, what rules are applied to them, and where can I find this information about the rules associated with header files?
Please help.
thanks,
Dinesh
I am a Cppcheck developer.
Whether you should include headers or not is not a technically trivial question. There are both benefits and drawbacks to headers for the analysis: better type information is a good thing; expanding macros might be a bad thing.
In case you wonder: the same checkers will be used whether headers are included or not. It's just that the input data is not always better when all headers are included.
I certainly recommend that you don't include any standard headers: stdio, string, STL, etc.
I personally normally don't include various system headers. I would prefer to create a cfg file instead if I use a library. That will give Cppcheck better information about the library than the headers.
I normally try to include local headers in the project. Use -I to add good paths in the project.
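As a concrete (hypothetical) example of that setup, where include/ and src/ are your project paths and mylib.cfg is a library configuration you wrote yourself:

cppcheck -I include/ --library=mylib -j 4 --max-configs=2 src/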

Restrict zip file size in VB.Net

I have to create a subroutine in VB.Net that compresses some files into a "file.zip" file, but the problem is that this "file.zip" MUST have a maximum size of 2 MB.
I don't know how to do it, or even whether it's possible.
It would be nice if someone has some example to show me.
It is not possible to do this in the general case. For example if you have a 2GB movie file, no lossless compression algorithm will ever get it to 2MB.
One solution is to "chunk" your ZIP file. That is, divide it into parts that are individually no more than 2MB. 7-Zip has support for this. You can use their .NET API from VB.Net. I'm not sure whether the API provides direct support for chunking. If not, you can start 7-Zip from your program using Process.Start().
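For reference, the chunking itself is done with 7-Zip's volume switch; a command along these lines (paths and names are illustrative) could be launched from VB.Net with Process.Start():

7z a -tzip -v2m file.zip C:\data\*

Note that the resulting parts (file.zip.001, file.zip.002, ...) are volumes of a single archive, so all parts normally need to be present to extract it.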

How can I recover files from a corrupted .tar.gz archive?

I have a large number of files in a .tar.gz archive. Checking the file type with the command
file SMS.tar.gz
gives the response
gzip compressed data - deflate method , max compression
When I try to extract the archive with gunzip, after a delay I receive the message
gunzip: SMS.tar.gz: unexpected end of file
Is there any way to recover even part of the archive?
Recovery is possible but it depends on what caused the corruption.
If the file is just truncated, getting some partial result out is not too hard; just run
gunzip < SMS.tar.gz > SMS.tar.partial
which will give some output despite the error at the end.
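You can then pull whatever survived out of the partial archive with something like

tar xvf SMS.tar.partial

tar will extract the complete files it finds and then complain about the truncated end.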
If the compressed file has large missing blocks, it's basically hopeless after the bad block.
If the compressed file is systematically corrupted in small ways (e.g. transferring the binary file in ASCII mode, which smashes carriage returns and newlines throughout the file), it is possible to recover, but it requires quite a bit of custom programming; it's really only worth it if you have absolutely no other recourse (no backups) and the data is worth a lot of effort. (I have done it successfully.) I mentioned this scenario in a previous question.
The answers for .zip files differ somewhat, since zip archives have multiple separately-compressed members, so there's more hope (though most commercial tools are rather bogus: they eliminate warnings by patching CRCs, not by recovering good data). But your question was about a .tar.gz file, which is an archive with one big member.
Are you sure that it is a gzip file? I would first run 'file SMS.tar.gz' to validate that.
Then I would read the gzip Recovery Toolkit page.
Here is one possible scenario that we encountered. We had a tar.gz file that would not decompress; trying to unzip it gave the error:
gzip -d A.tar.gz
gzip: A.tar.gz: invalid compressed data--format violated
I figured out that the file may have been originally uploaded over a non-binary FTP connection (we don't know for sure).
The solution was relatively simple using the Unix dos2unix utility:
dos2unix A.tar.gz
dos2unix: converting file A.tar.gz to UNIX format ...
tar -xvzf A.tar.gz
file1.txt
file2.txt
....etc.
It worked!
This is one slim possibility, and maybe worth a try - it may help somebody out there.