Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 6 years ago.
I was looking at benchmarks and I can't see a difference; OpenGL 4.5 performs the same as Vulkan. Can an API affect graphics quality?
It is a bit broad, but it cannot hurt to have the motivation question answered.
This official video presentation discusses some of the differences: https://www.youtube.com/watch?v=iwKGmm3lw8Q
The Vulkan API is a complete rework.
It also gives the programmer more control (but in doing so requires them to do more, and to know more).
Because of the above, any graphics application also requires a wholehearted rework; otherwise the benefits simply won't manifest. I don't keep up to date, but I think big engines like UE4 and Unity are still working on how to incorporate Vulkan in a non-naive manner.
Some benefits can already be seen in benchmarks, though not in every benchmark. Some workloads are fine for OpenGL, so Vulkan cannot show any improvement there. Some applications perhaps add Vulkan support only as an afterthought, making the comparison unfair. And optimizing some Vulkan drivers may not be a priority (e.g. for older GPU cards).
The main benefit of Vulkan is on the CPU side. It may manifest in ways other than FPS, such as less fan noise (lower temperatures), longer battery life, and simply more free CPU time for other tasks.
Vulkan also gives more control to the programmer. If exploited, this may also translate into other non-FPS benefits, like lower input latency and less hitching.
Vulkan also requires less of the driver, hopefully making it easier to optimize and making GPU companies more willing to adopt it and implement it even on older cards.
Everything else being the same (including the program itself, as far as it can be), there should be no overall difference in the resulting image quality. Pixel values can differ slightly here and there, though.
Related
Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 7 years ago.
A short question:
What is the meaning of Simulation and Synthesis in VHDL?
What is the difference between Simulation and Synthesis in VHDL?
Yours sincerely
Momo
As you've probably realized by now, VHDL is not a programming language but a hardware description language. It is very easy to get confused about the terminology, because an HDL doesn't work like software.
Simulation consists of using a simulator (surprise) such as ModelSim to interpret your VHDL code while stimulating its inputs, to see what the outputs would look like. The results are typically displayed in a waveform chart, so whenever you see a waveform chart, odds are it's about simulation. Simulation takes place entirely on a computer and never involves an actual FPGA. Simulation software can be very expensive; I recently came across a free online tool with fair simulation capabilities: EDA Playground.
Synthesis is a completely different thing. Once your design has been proven to work in simulation, the VHDL code goes through a tough process that figures out how to implement, simplify, place and route the actual resources in the FPGA to perform the functions it's supposed to (think of it as the hardware equivalent of compiling). The output of this process is a file (the bitstream) that is downloaded to the FPGA.
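To make the distinction concrete, here is a minimal sketch (entity and signal names are illustrative): a 2-to-1 multiplexer that synthesis can map to real FPGA resources, plus a testbench that only makes sense in simulation, because constructs such as `wait for 10 ns` have no hardware equivalent and are rejected by synthesis tools.

```vhdl
-- Synthesizable: a 2-to-1 multiplexer (maps to gates/LUTs in the FPGA)
library ieee;
use ieee.std_logic_1164.all;

entity mux2 is
  port (a, b, sel : in  std_logic;
        y         : out std_logic);
end entity;

architecture rtl of mux2 is
begin
  y <= a when sel = '0' else b;
end architecture;

-- Simulation-only: a testbench that stimulates the mux and checks outputs.
library ieee;
use ieee.std_logic_1164.all;

entity mux2_tb is end entity;

architecture sim of mux2_tb is
  signal a, b, sel, y : std_logic;
begin
  dut : entity work.mux2 port map (a => a, b => b, sel => sel, y => y);

  stimulus : process
  begin
    a <= '1'; b <= '0'; sel <= '0';
    wait for 10 ns;  -- no hardware meaning; simulation only
    assert y = '1' report "sel=0 should select a" severity error;
    sel <= '1';
    wait for 10 ns;
    assert y = '0' report "sel=1 should select b" severity error;
    wait;  -- stop the process
  end process;
end architecture;
```

A simulator such as ModelSim (or GHDL on EDA Playground) runs both units and shows `y` in the waveform chart; the synthesis tool would accept only `mux2`.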
Hope it helps!
Closed. This question needs details or clarity. It is not currently accepting answers.
Closed 8 years ago.
Until now, I used to think that embedded and real-time systems are the same. But when I was asked in an interview what the difference between the two is, I was scared. I can't even find a proper answer by searching the web.
It was perhaps a poor question, since the two are not mutually exclusive; an embedded system may be real-time or it may not. One term describes the physical embodiment of a system; the other describes its performance and response characteristics.
"Embedded system" describes a system that contains one or more software-programmable devices but which is not itself a general-purpose computer. Such a system typically has a fixed, single application rather than end-user-selected and loaded software (which would make it general purpose).
However, "embedded" covers a wide spectrum of systems and is not always easy to define. For example, if you were writing the UMTS code for a smartphone, you might reasonably be regarded as an embedded developer; if you were writing Flappy Angry Birds 2.0 for that same phone, however, you would not. So a smartphone may be both an embedded system and a general-purpose computer, depending on your viewpoint. Similarly, a hand-held games console's system software is embedded; the games themselves, I would say, are not.
A real-time system is one with a deterministic, low-latency response to input events. An embedded system may be real-time, or it may not. I would normally use the term "real-time embedded system" to be clear.
Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
Questions asking us to recommend or find a tool, library or favorite off-site resource are off-topic for Stack Overflow as they tend to attract opinionated answers and spam. Instead, describe the problem and what has been done so far to solve it.
Closed 9 years ago.
I am working on a project involving the following with my team:
1. GUI and a keyboard for user interaction.
2. Real-time processing and display.
3. SPI communication.
4. USB-based printing.
Items 1, 2 and 3 are to be done in parallel.
Currently we are using a Raspberry Pi, but the R-Pi is lagging at the job. So: is there any other embedded processor that meets the above specs and costs less than $100?
Any suggestion would be highly appreciated.
PS: Do ask questions if I'm vague in my statements.
Your lack of real-time response probably has more to do with the fact that Linux is not a real-time OS than with the performance of the RPi. You can throw processing power at the problem if you like, but it still may not reliably solve your problem.
It is not possible to advise based on the little information you have provided; you'd need to define the real-time response requirements in terms of timing and the quantity of data to be processed.
While an RTOS might solve your real-time processing problems, it would leave you needing drivers for the USB printer and the display, plus a GUI implementation. These are readily available for Linux, but not so much for a typical low-cost RTOS - especially a USB printer driver, since the raster-image processing required is complex and resource-hungry; resources a typical Linux system will have.
If you have the necessary time and skill, you could port RTLinux to the RPi (or to some other board capable of supporting Linux). It has a different scheduler from the standard time-sharing kernel and can be used to improve real-time response, but it is no substitute for a real RTOS when deterministic performance is required.
You may be better off keeping the RPi and connecting it to a stand-alone microcontroller that performs the hard real-time processing; there are a number of example projects connecting an Arduino to an RPi, for instance. The microcontroller's lower clock rate does not mean slower response, since the processor can be dedicated to the task and will not non-deterministically switch to some other task for long periods.
Try the BeagleBone Black. Its 1 GHz processor should be more than sufficient for your processing. Also, it is ARMv7; Ubuntu dropped support for ARMv6 (the Pi) a couple of months ago.
http://beagleboard.org/products/beaglebone%20black
Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
This question does not appear to be about a specific programming problem, a software algorithm, or software tools primarily used by programmers. If you believe the question would be on-topic on another Stack Exchange site, you can leave a comment to explain where the question may be able to be answered.
Closed 8 years ago.
From a history of graphics hardware:
Indeed, in the most recent hardware era, hardware makers have added features to GPUs that have somewhat... dubious uses in the field of graphics, but substantial uses in GPGPU tasks.
What is the author referring to here?
I would assume that it is referring to the extra hardware features, as well as the abstractions added to support GPGPU initiatives such as CUDA and OpenCL.
From the description of CUDA:
CUDA has several advantages over traditional general-purpose computation on GPUs (GPGPU) using graphics APIs:
Scattered reads – code can read from arbitrary addresses in memory.
Shared memory – CUDA exposes a fast shared memory region (up to 48KB per Multi-Processor) that can be shared amongst threads. This can be used as a user-managed cache, enabling higher bandwidth than is possible using texture lookups.
Faster downloads and readbacks to and from the GPU.
Full support for integer and bitwise operations, including integer texture lookups.
These are all features that are relevant when implementing for CUDA or OpenCL, but are somewhat irrelevant (at least directly) to graphics APIs such as OpenGL. GPGPU features can still be leveraged in unconventional ways to supplement the traditional graphics pipeline, though.
The example of "CUDA exposes a fast shared memory region" would be an additional hardware requirement potentially useless to OpenGL.
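As a rough sketch of the "user-managed cache" idea (kernel and variable names are illustrative, not from any particular codebase): each block stages its slice of the input into `__shared__` memory once, then reduces it entirely on-chip, avoiding the repeated global-memory or texture reads a graphics-API approach would need.

```cuda
// Each block sums 256 input elements, using on-chip shared memory
// as a user-managed cache.
__global__ void block_sum(const float *in, float *out)
{
    __shared__ float tile[256];          // fast per-block shared memory
    unsigned t = threadIdx.x;
    tile[t] = in[blockIdx.x * 256 + t];  // one global read per thread
    __syncthreads();                     // wait until the tile is filled

    // Tree reduction entirely in shared memory: no further global reads.
    for (unsigned s = 128; s > 0; s >>= 1) {
        if (t < s) tile[t] += tile[t + s];
        __syncthreads();
    }
    if (t == 0) out[blockIdx.x] = tile[0];  // one global write per block
}
```

Launched as `block_sum<<<numBlocks, 256>>>(d_in, d_out)`, each block writes one partial sum; nothing in the graphics pipeline exposes this kind of explicitly managed on-chip memory to the programmer.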
You can read this detailed document describing the architecture required for CUDA, and the differences between it and traditional graphics-only GPUs.
Closed. This question is off-topic. It is not currently accepting answers.
Closed 10 years ago.
As far as I understand it, embedded software is just software (running on a general-purpose CPU) that has little if any user input or configuration. Embedded software powers IP routers, cars, computer mice, etc.
My question is:
When (roughly) was the historical moment at which embedded software was first considered cost-effective for some applications (compared with an equivalent technical solution not involving embedded software)? For which applications, and why?
Detail: obviously there is a tradeoff between the cost of a CPU fast enough to perform X in software and the cost of designing hardware that performs X.
Embedded systems date from the Apollo moon landings - specifically the Apollo Guidance Computer (AGC), widely held to be one of the first examples of an embedded system.
Commercially, in the early 1970s, early microprocessors were being employed in products - famously, the 4-bit Intel 4004 used in the Busicom 141-PF. Bill Gates and Paul Allen saw the potential of embedded microprocessors early with their pre-Microsoft endeavour, the Traf-O-Data traffic survey counter.
So I would suggest around 1971/72, with the introduction of the Intel 4004 and the more powerful 8-bit 8008. Note that unlike the still more powerful Intel 8080, which inspired the first home-brew microcomputers and the MITS Altair, the 4004 and 8008 were barely suitable for use in a general-purpose "computer" as such; embedded computing systems therefore pre-date general-purpose microcomputers.
I would dispute your characterisation of what an embedded system is; if you were asking that question, here's my answer to a similar question.