Does ECC RAM help prevent buffer overflow attacks? [closed]

Does having ECC RAM protect against buffer overflow attacks? I can't find anything on the web, and I'm quite curious.

Nope! Not in the slightest.
Ok, let's start with some definitions:
Error correction code memory (ECC memory) is a type of computer data storage that uses an error correction code (ECC) to detect and correct n-bit data corruption which occurs in memory. ECC memory is used in most computers where data corruption cannot be tolerated under any circumstances, like industrial control applications, critical databases, and infrastructural memory caches. (Source: Wikipedia)
In information security and programming, a buffer overflow, or buffer overrun, is an anomaly where a program, while writing data to a buffer, overruns the buffer's boundary and overwrites adjacent memory locations. (Source: Wikipedia)
Basically, what ECC is designed to do is fix corruption in memory from, for example, cosmic background radiation. It's a hardware-level technique that doesn't know anything about what the memory contains.
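To make the "hardware-level" point concrete, here is a toy sketch of the kind of code ECC is built on: a Hamming(7,4) code that corrects any single flipped bit. Real ECC DIMMs use wider SECDED codes implemented in the memory controller; this is only an illustration of the principle, not how any particular module works.

```c
/* Toy illustration of the idea behind ECC: a Hamming(7,4) code that can
 * correct any single flipped bit. Not how real ECC memory is implemented. */
#include <stdio.h>

/* Encode 4 data bits into a 7-bit codeword. Bit layout, 1-indexed from the
 * least significant bit: p1 p2 d1 p3 d2 d3 d4. */
static unsigned encode(unsigned d) {
    unsigned d1 = (d >> 0) & 1, d2 = (d >> 1) & 1,
             d3 = (d >> 2) & 1, d4 = (d >> 3) & 1;
    unsigned p1 = d1 ^ d2 ^ d4;
    unsigned p2 = d1 ^ d3 ^ d4;
    unsigned p3 = d2 ^ d3 ^ d4;
    return p1 | (p2 << 1) | (d1 << 2) | (p3 << 3) |
           (d2 << 4) | (d3 << 5) | (d4 << 6);
}

/* Decode a 7-bit codeword, correcting a single-bit error if one occurred. */
static unsigned decode(unsigned c) {
    unsigned b[8];
    for (int i = 1; i <= 7; i++) b[i] = (c >> (i - 1)) & 1;
    unsigned s1 = b[1] ^ b[3] ^ b[5] ^ b[7];
    unsigned s2 = b[2] ^ b[3] ^ b[6] ^ b[7];
    unsigned s3 = b[4] ^ b[5] ^ b[6] ^ b[7];
    unsigned syndrome = s1 | (s2 << 1) | (s3 << 2);
    if (syndrome) b[syndrome] ^= 1;            /* flip the bad bit back */
    return b[3] | (b[5] << 1) | (b[6] << 2) | (b[7] << 3);
}

int main(void) {
    unsigned data = 0xB;                       /* 4 data bits: 1011        */
    unsigned code = encode(data);
    unsigned hit  = code ^ (1u << 4);          /* cosmic ray flips one bit */
    printf("data %x -> corrected %x\n", data, decode(hit));
    return 0;
}
```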
Buffer overflows are a software-level technique where you exploit bad code to escape the boundaries of a particular variable and write into other parts of memory. You're still writing valid bytes; they're just not where the program expects them to be.
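For contrast, here is a minimal, hypothetical sketch of a buffer overflow (the names and layout are invented for illustration). Every byte written is perfectly valid as far as the hardware is concerned, so ECC has nothing to flag:

```c
/* Hypothetical sketch of a classic buffer overflow. The bytes written past
 * the end of `buf` are valid data from the hardware's point of view, so ECC
 * has nothing to detect or correct. */
#include <stdio.h>
#include <string.h>

struct account {
    char buf[8];      /* fixed-size buffer                          */
    int  is_admin;    /* adjacent memory the attacker wants to hit  */
};

int main(void) {
    struct account a = { "", 0 };

    /* Far more bytes than buf can hold are copied; the excess spills into
     * a.is_admin. Every bit stored is "correct" as far as ECC is concerned;
     * the corruption is purely a software/logic problem. */
    strcpy(a.buf, "AAAAAAAAAAAAAAAAAAA");

    printf("is_admin = %d\n", a.is_admin);     /* no longer 0 */
    return 0;
}
```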

Related

Create a vCPU that consists of multiple CPUs [closed]

Short question and hopefully a positive answer:
Is it possible to create a virtual CPU that consists of multiple real cores?
So let's say you have a 4x3.5 GHz CPU, can you create a vCPU that is 1x14 GHz?
Why do it?
If there is software that is CPU-heavy but can only use one thread, this would speed the program up.
I am not very advanced with hardware tech, but I guess there is no way to do that.
Thanks.
So let's say you have a 4x3.5 GHz CPU, can you create a vCPU that is 1x14 GHz?
No. As the expression goes -- nine women cannot make a baby in one month.
Each instruction executed by a virtual CPU can potentially be dependent on anything that previously happened on that CPU. There's no way to run an instruction (or a group of instructions) before all of the previous instructions have been completed. That leaves no room for another physical CPU to speed things up.
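As a small illustration of such a dependency chain (my own example, not from the question), consider a loop where every iteration needs the previous iteration's result; no second core can start iteration i+1 before iteration i finishes, no matter how the CPU is virtualized:

```c
/* Hypothetical illustration of a serial dependency chain: each iteration
 * needs the result of the previous one, so no second core can start
 * iteration i+1 before iteration i finishes. */
#include <stdio.h>

int main(void) {
    double x = 1.0;
    for (long i = 0; i < 100000000L; i++) {
        /* x depends on the x produced by the previous iteration */
        x = x * 1.0000001 + 1e-9;
    }
    printf("%f\n", x);
    return 0;
}
```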

How to convert old CDC mainframe PRUs to bytes? [closed]

The old CDC 6600, designed by Seymour Cray and running the Kronos operating system, was, I believe, a 60-bit mainframe. It referred to units of storage as PRUs. What was a PRU, and how can it be converted to bytes? I read that a disk storage device held, for example, 200,000 PRUs in the late 1970s. I'm curious to find out what size this is in modern terms.
Quoted directly from Wikipedia:
The central processor had 60-bit words, whilst the peripheral processors had 12-bit words. CDC used the term "byte" to refer to 12-bit entities used by peripheral processors.
Assuming a PRU is one such "byte", that would yield 200,000 * 12 / 8 = 300,000 8-bit bytes of storage. This seems a bit "small", even for the day.
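For what it's worth, that arithmetic can be written out as a small program. It assumes, as this answer does, that one PRU equals one 12-bit CDC "byte"; the real PRU size may well differ:

```c
/* Sketch of the conversion above. It assumes, as the answer does, that one
 * PRU equals one 12-bit CDC "byte"; actual PRU sizes may differ. */
#include <stdio.h>

int main(void) {
    const long prus          = 200000; /* figure quoted for a late-1970s disk */
    const int  bits_per_pru  = 12;     /* CDC peripheral-processor "byte"     */
    const int  bits_per_byte = 8;      /* modern byte                         */

    long total_bits  = prus * bits_per_pru;
    long total_bytes = total_bits / bits_per_byte;

    printf("%ld PRUs = %ld bits = %ld 8-bit bytes\n",
           prus, total_bits, total_bytes);
    return 0;
}
```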
According to the description of a disk mass storage unit pictured in the CDC 6400/6500/6600 Reference Manual, it held 500 million bits of data (or about 60 MB of 8-bit storage / 40 MB of 12-bit storage). This was a very large device for the time. I remember working on a VAX 11/70 (super mini) in the early 80s that had three whopping 67 MB drives - thought I had died and gone to heaven.
This does not answer what a PRU is, but it does shed some light on the size of mass storage devices used on "supercomputers" in the 70s.

What are the side effects of constantly ending processes through Ctrl + Alt + Del? [closed]

I always feel guilty when I end a process in the Task Manager, thinking to myself that if there were a Microsoft developer behind me he'd probably say 'if you only knew what you're doing...'. So, is it really that bad to kill processes, or is it actually something with no relevant collateral damage at all? Thanks!
It depends on what you're killing, really.
If that process has created some temp files, they aren't going to get cleaned up. If the process was in the middle of writing to a file, the file will be incomplete.
I wouldn't worry about the side effects from killing a frozen notepad.exe. But if it is something like VMWare Workstation, then yes, I would worry because my VM might be corrupted.
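As a hypothetical sketch of why those temp files get left behind: cleanup code registered for normal exit simply never runs when the process is killed from Task Manager (or with kill -9 on Unix):

```c
/* Hypothetical sketch: cleanup registered with atexit() runs on a normal
 * exit, but never runs if the process is killed from Task Manager (or via
 * kill -9), which is why temp files and half-written data get left behind. */
#include <stdio.h>
#include <stdlib.h>

static void cleanup(void) {
    remove("scratch.tmp");              /* delete the temp file on normal exit */
    puts("temp file removed");
}

int main(void) {
    atexit(cleanup);                    /* skipped entirely on a forced kill   */

    FILE *f = fopen("scratch.tmp", "w");
    if (!f) return 1;
    fprintf(f, "partially written state...\n");
    fflush(f);

    getchar();                          /* kill the process here and           */
    fclose(f);                          /* scratch.tmp is left on disk         */
    return 0;
}
```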
I'd have to second the answer of 'it depends'. A large percentage of programs out there won't cost you much more than whatever you were doing at the time of the kill. With that said, there are other programs that could suffer significant damage depending on when you kill it. It all depends on what the application is doing, what kind of temp/state/etc files/data it's using at the time, etc.
With all that said, I wouldn't think 'death by task manager' would be all that common of an activity. Sure, there are programs that go south off and on, but I'm rarely pushed to having to kill a process with that kind of force...

Is it possible to measure computer components power consumption in software? [closed]

I was wondering if (most common) motherboards provide hardware capabilities for measuring the exact power expended by the individual components - CPU, RAM, WiFi, etc.
As voltages are read and directly available, e.g. in the BIOS, I reckon that a similar interface may be provided for power consumption as well.
Searching Google for fedora18 power optimization pops up a nice guide. In it, PowerTOP is recommended and its use is described. The program "is a software utility designed to measure, explain and minimise a computer's electrical power consumption", and this is as close as it gets to answering my question. Voila.
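As a supplementary sketch (assuming a Linux machine whose CPU exposes Intel RAPL counters through the powercap sysfs interface; the path below may not exist on your hardware), the package energy counter can be read directly and turned into an average power figure:

```c
/* Rough sketch, assuming a Linux machine that exposes Intel RAPL counters
 * through the powercap sysfs interface (the path below may not exist on your
 * hardware, and reading it may require root). Reads the package energy
 * counter twice and reports the average power over the interval. */
#include <stdio.h>
#include <unistd.h>

static long long read_energy_uj(const char *path) {
    long long uj = -1;
    FILE *f = fopen(path, "r");
    if (f) {
        if (fscanf(f, "%lld", &uj) != 1) uj = -1;
        fclose(f);
    }
    return uj;
}

int main(void) {
    const char *path = "/sys/class/powercap/intel-rapl:0/energy_uj";
    long long e0 = read_energy_uj(path);
    sleep(1);
    long long e1 = read_energy_uj(path);
    if (e0 < 0 || e1 < 0) {
        fprintf(stderr, "RAPL counter not available at %s\n", path);
        return 1;
    }
    /* energy_uj is in microjoules; over a 1-second interval the delta is
     * (approximately) the average power in microwatts */
    printf("CPU package power: ~%.2f W\n", (e1 - e0) / 1e6);
    return 0;
}
```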

Are there ciphers whose output gets smaller? [closed]

I'm playing around with text transformations - ciphers. From all that I have surveyed it seems that all of these algorithms either break even in terms of transformed message length, or get larger. Are there any known algorithms/text transformations that when applied to a message actually make the message smaller (not counting the key, of course)?
For instance, RSA, when you encode the message, makes the encrypted message quite a bit larger than the original. Is there any such transformation where the message becomes smaller, instead of larger, after encryption (or transformation, or whatever you want to call it)?
I'm not doing this as part of security, so whether or not it's hackable is not of any interest to me.
P.S. I've done a lot of research in this area already through search engines (google, wikipedia, etc) but I have found no results. I don't want to say that such a technique doesn't exist without at least posting the question publicly first.
Thanks!
Compression tries to make the input smaller. Obviously, lossless compression will not make every input smaller; that's impossible by a simple counting argument, since there are more distinct long messages than short ones to map them onto.
You can encrypt the compressed input if you want that. In principle compression and encryption are orthogonal concepts, but in some situations the length of the compressed text can be used to attack the system.
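Here is a minimal sketch of "compress first, then encrypt", using zlib for the compression step and a toy XOR pass standing in for a real cipher (the key handling is purely illustrative and not secure):

```c
/* Minimal sketch of compress-then-encrypt: zlib shrinks the plaintext where
 * it can, then a toy, insecure XOR pass stands in for a real cipher.
 * Build with: cc example.c -lz */
#include <stdio.h>
#include <string.h>
#include <zlib.h>

int main(void) {
    const char *msg = "hello hello hello hello hello hello hello hello";
    uLong src_len = (uLong)strlen(msg);

    Bytef out[256];
    uLongf out_len = sizeof out;

    /* Step 1: compress. Highly repetitive input shrinks; random input may
     * actually grow slightly (no lossless scheme shrinks everything). */
    if (compress(out, &out_len, (const Bytef *)msg, src_len) != Z_OK) {
        fprintf(stderr, "compress failed\n");
        return 1;
    }

    /* Step 2: "encrypt" the compressed bytes. A repeating XOR key is only a
     * placeholder for a real cipher; it keeps the length unchanged. */
    const Bytef key[] = "illustrative-key";
    for (uLongf i = 0; i < out_len; i++)
        out[i] ^= key[i % (sizeof key - 1)];

    printf("plaintext: %lu bytes, compressed+encrypted: %lu bytes\n",
           (unsigned long)src_len, (unsigned long)out_len);
    return 0;
}
```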
At first I thought about language transformation. Some English phrases translate to a single Chinese symbol. That's not a rigorous, mathematical example, but I suppose it qualifies.
Alternatively, from a bit-wise perspective, it wouldn't be possible to cipher/encode 2 bits of information in 1 bit.