How to convert old CDC mainframe PRUs to bytes? [closed]

The old CDC 6600, designed by Seymour Cray and running the Kronos operating system, was, I believe, a 60-bit mainframe. It referred to units of storage as PRUs. What was a PRU, and how can it be converted to bytes? I read that a disk storage device in the late 1970s held, for example, 200,000 PRUs. I'm curious to find out what size this is in modern terms.

Quoted directly from Wikipedia:
The central processor had 60-bit words, whilst the peripheral processors had 12-bit words. CDC used the term "byte" to refer to 12-bit entities used by peripheral processors.
Assuming a PRU is one such "byte", that would yield (200,000 * 12 / 8) = 300,000 eight-bit bytes of storage. This seems a bit "small", even for the day.
According to the description of a disk mass storage unit pictured in the CDC 6400/6500/6600 Reference Manual, it held 500 million bits of data (or about 60 MB as 8-bit bytes / 40 MB as 12-bit bytes). This was a very large device for the time. I remember working on a VAX 11/70 (super mini) in the early '80s that had three whopping 67 MB drives - I thought I had died and gone to heaven.
This does not answer what a PRU is, but it does shed some light on the size of mass storage devices used on "supercomputers" in the '70s.
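For what it's worth, here is a minimal C sketch of the back-of-envelope arithmetic above, under the answer's own (unverified) assumption that one PRU is a single 12-bit CDC "byte":

    #include <stdio.h>

    int main(void) {
        const double prus         = 200000; /* PRUs quoted for the drive */
        const double bits_per_pru = 12;     /* assumption: 1 PRU = one 12-bit "byte" */
        const double drive_bits   = 500e6;  /* capacity of the pictured mass storage unit */

        /* 200,000 * 12 / 8 = 300,000 eight-bit bytes */
        printf("PRUs as 8-bit bytes:    %.0f\n", prus * bits_per_pru / 8);
        /* 500 million bits is about 62.5 MB of 8-bit bytes, or ~41.7 M 12-bit bytes */
        printf("drive as 8-bit MB:      %.1f\n", drive_bits / 8 / 1e6);
        printf("drive as 12-bit Mbytes: %.1f\n", drive_bits / 12 / 1e6);
        return 0;
    }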

Related

Does ECC RAM help prevent buffer overflow attacks? [closed]

Does having ECC RAM protect against buffer overflow attacks? I can't find anything on the web, and I'm quite curious.
Nope! Not in the slightest.
Ok, let's start with some definitions:
Error correction code memory (ECC memory) is a type of computer data storage that uses an error correction code (ECC) to detect and correct n-bit data corruption which occurs in memory. ECC memory is used in most computers where data corruption cannot be tolerated under any circumstances, like industrial control applications, critical databases, and infrastructural memory caches. (Source: Wikipedia)
In information security and programming, a buffer overflow, or buffer overrun, is an anomaly where a program, while writing data to a buffer, overruns the buffer's boundary and overwrites adjacent memory locations. (Source: Wikipedia)
Basically, what ECC is designed to do is fix corruption in memory from, for example, cosmic background radiation. It's a hardware-level technique that doesn't know anything about what the memory contains.
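To make that concrete, here is a toy C sketch of the idea (not real ECC: it uses a single even-parity bit, which can only detect a one-bit flip, whereas real ECC DIMMs use Hamming-style codes that can also correct it):

    #include <stdio.h>
    #include <stdint.h>

    /* Even parity over the 8 bits of a byte. */
    static uint8_t parity(uint8_t byte) {
        uint8_t p = 0;
        for (int i = 0; i < 8; i++)
            p ^= (byte >> i) & 1;
        return p;
    }

    int main(void) {
        uint8_t stored = 0x5A;           /* value written to "memory" */
        uint8_t check  = parity(stored); /* check bit stored alongside it */

        stored ^= 1 << 3;                /* a stray bit flip, e.g. from radiation */

        if (parity(stored) != check)
            printf("bit flip detected: byte is now 0x%02X\n", stored);
        return 0;
    }

Note that the check logic never looks at what the byte means; it only verifies that the bits read back match the bits written.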
Buffer overflows are a software-level technique where you exploit bad code to escape the boundaries of a particular variable and jump into other parts of memory. You're still writing valid bytes, they're just not where the program expects them to be.
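Here is a deliberately broken C sketch of that kind of bad code. The behaviour is undefined and depends on compiler and stack layout (a modern stack protector may abort instead), but it shows why ECC is irrelevant: every overflowing write is an ordinary, error-free memory write as far as the hardware is concerned:

    #include <stdio.h>
    #include <string.h>

    int main(void) {
        char admin = 0; /* adjacent variable an attacker might target */
        char buf[8];    /* 8-byte buffer with no length enforcement */

        /* strcpy() does no bounds checking: 16 characters plus a
           terminating NUL are written into an 8-byte buffer, spilling
           into whatever memory happens to sit next to it. */
        strcpy(buf, "AAAAAAAAAAAAAAAA");

        printf("admin = %d\n", admin); /* may now be nonzero */
        return 0;
    }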

Which RAM type should I get for my laptop? [closed]

My laptop is a Sony Vaio, model no. SVE1511AENB, with 2 GB of RAM of type "1Rx8 PC3-12800S-11-11-B2". I want to expand my RAM, so which RAM should I buy for my laptop?
Definitely off topic but I feel compelled to answer anyway. Stack Overflow is for programming questions. Please use Super User or a hardware forum in future.
The link below says your laptop supports up to 8 GB of DDR3 1333 MHz (PC3-10600) 204-pin SODIMM RAM, so get 2x 4 GB sticks. As long as you search for "DDR3 1333MHz 204" or "DDR3 PC3-10600 204" RAM, the choice is then primarily based on price. Some sites even say it supports 1600 MHz (PC3-12800), which is what you currently have in your system.
I put "ddr3 pc3-10600 1333mhz 204-pin sodimm 4gb" into eBay and plenty of results came up.
http://tech.firstpost.com/product/laptops/vaio-sve1511aenb-specification-293362.html

Create a vCPU that consists of multiple CPUs [closed]

Short question and hopefully a positive answer:
Is it possible to create a virtual CPU that consists of multiple real cores?
So let's say you have a 4x 3.5 GHz CPU; can you create a vCPU that is 1x 14 GHz?
Why do it?
If there is software that is heavily CPU-bound but can only use one thread, this would speed the program up.
I am not very advanced with hardware tech, but I guess there is no way to do that.
Thanks.
So let's say you have a 4x 3.5 GHz CPU; can you create a vCPU that is 1x 14 GHz?
No. As the expression goes -- nine women cannot make a baby in one month.
Each instruction executed by a virtual CPU can potentially be dependent on anything that previously happened on that CPU. There's no way to run an instruction (or a group of instructions) before all of the previous instructions have been completed. That leaves no room for another physical CPU to speed things up.
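A minimal C sketch of that dependency-chain argument: each iteration below needs the result of the previous one, so there is no independent work a hypervisor could hand to a second physical core, no matter how many sit idle:

    #include <stdio.h>

    int main(void) {
        unsigned x = 1;
        for (int i = 0; i < 1000000; i++) {
            /* Each step consumes the value the previous step produced
               (a linear congruential generator step), so step N cannot
               begin before step N-1 has finished. */
            x = x * 1664525u + 1013904223u;
        }
        printf("%u\n", x);
        return 0;
    }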

Why is PDF file size so small? [closed]

I have a few of this semester's textbooks as PDFs. These are 1000-page computer science textbooks full of graphics. When I downloaded one, it took just a few seconds, which was so amazing I thought something had gone wrong. The entire textbook was 9.7 MB. I opened it up and, sure enough, the entire textbook was there; all the images and everything loaded instantly (and I have a really terrible internet connection).
I am just wondering what amazing compression technique allows you to store 1000 pages of a textbook in under 10 MB?
Here is a screenshot of the file properties; I am so baffled.
A typical text page is between 3 KB and 6 KB of characters. So the text of your 1000-page book would fit in 6 MB even without compression.
Normal compression tools can reduce plain ASCII text by something like 60-80%.
So let's say it's 75%; then you need 0.25 x 6 MB = 1.5 MB for the text. That leaves about 8.2 MB for the pictures.
For vector-based images like SVG that's a lot; they are small and compress as well as text. But 8.2 MB does not leave room for a lot of embedded bitmaps.
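For reference, a quick C sketch of the estimate above, under its stated (rough) assumptions of about 6 KB of text per page and 75% compression:

    #include <stdio.h>

    int main(void) {
        const double pages          = 1000;
        const double bytes_per_page = 6e3;   /* upper end of the 3-6 KB range */
        const double keep_fraction  = 0.25;  /* 75% compression keeps 25% */
        const double pdf_size       = 9.7e6; /* reported file size in bytes */

        double raw_text   = pages * bytes_per_page;   /* 6.0 MB */
        double compressed = raw_text * keep_fraction; /* 1.5 MB */
        double for_images = pdf_size - compressed;    /* ~8.2 MB */

        printf("raw text:        %.1f MB\n", raw_text / 1e6);
        printf("compressed text: %.1f MB\n", compressed / 1e6);
        printf("left for images: %.1f MB\n", for_images / 1e6);
        return 0;
    }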

History of Embedded Software [closed]

As far as I understand it, embedded software is just software (running on a general-purpose CPU) that has little, if any, user input or configuration. Embedded software powers IP routers, cars, computer mice, etc.
My question is:
When (roughly) was embedded software first considered cost-effective for some applications, rather than an equivalent technical solution not involving embedded software? Which applications, and why?
Detail: obviously there is a tradeoff between the cost of a CPU fast enough to perform X in software and the cost of designing hardware that performs X.
Embedded systems date from the Apollo moon landings - specifically the Apollo Guidance Computer (AGC), widely held to be one of the first examples of an embedded system.
Commercially, early microprocessors were being employed in products in the early 1970s - famously the 4-bit Intel 4004 used in the Busicom 141-PF calculator. Bill Gates and Paul Allen saw the potential of embedded microprocessors early on with their pre-Microsoft endeavour, the Traf-O-Data traffic survey counter.
So I would suggest around 1971/72, with the introduction of the Intel 4004 and the more powerful 8-bit 8008. Note that, unlike the still more powerful Intel 8080, which inspired the first home-brew microcomputers and the MITS Altair, the 4004 and 8008 were barely suitable for use as a general-purpose "computer" as such; embedded computing systems therefore pre-date general-purpose microcomputers.
I would dispute your characterisation of what an embedded system is; if you were asking that question, here's my answer to a similar question.