Set a resolution higher than the monitor can natively display - Screenshots [closed]

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
Closed 8 years ago.
I'm looking for a way to take screenshots at a resolution of 2560x1600 pixels (current game: Skyrim).
My display only supports resolutions up to Full HD. Since I can't select anything above Full HD in the options menu, I tried modifying the preferences file ("SkyrimPrefs.ini") instead.
When I try to start the game, the following error appears:
Failed to initialize renderer.
Your display doesn't support the selected resolution.
Can this be worked around somehow? I mean, the screenshot itself shouldn't care about the monitor's resolution; it just needs the signal from the graphics chip, if I've got this right.

First, your screen cannot display a resolution higher than Full HD, because it simply lacks the required number of pixels.
What you want is to downsample the game: the image is rendered at the higher resolution and then scaled down to your screen's resolution.
This is not configured in SkyrimPrefs.ini, but in the graphics driver itself. Furthermore, I don't know whether screenshots are taken from the framebuffer or from the screen output. You could try FRAPS, which gets its data from the framebuffer.
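For reference, the resolution entries in SkyrimPrefs.ini are presumably the ones below, in the [Display] section; changing only these produces exactly the renderer error you saw, because the monitor doesn't actually offer that mode:

```ini
; SkyrimPrefs.ini, [Display] section — a sketch of the resolution entries
; presumably edited in the question; the driver must expose the mode too
[Display]
iSize W=2560
iSize H=1600
```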
I can't explain the procedure thoroughly myself, but luckily other people out there have already done so.
Since I don't know which graphics card or OS you have, here are guides for both major vendors:
A guide for NVIDIA: http://www.neogaf.com/forum/showthread.php?t=509076
A guide for AMD: http://www.neogaf.com/forum/showthread.php?t=472941
I hope this helps you.

Related

Monitor going black for no reason [closed]

Closed. This question is not about programming or software development. It is not currently accepting answers.
Closed 5 months ago.
OK, so my monitor is going black on random occasions, mostly when I watch a video. It doesn't matter whether it's on YouTube, Facebook, Udemy, or any other site.
I checked my cables and they are all good. I also turned off the screen saver.
Any ideas what it could be?
There could be many reasons for this: bad drivers, bad cables, a failing screen, or the GPU overheating and melting solder connections (I've experienced this myself).
The easiest thing to check is whether the issue is in the computer itself. To do that, connect to a different monitor, using a different cable.
To check whether it's a software issue, you could try running a live CD of a different OS on your computer (for example, Fedora or Ubuntu).
If the issue still happens even with a different OS, it's likely a GPU problem: you'll need to either get the card replaced, replace the mainboard (if the GPU is integrated), or replace the computer.

Raspberry Pi 4 doesn't boot when attaching camera [closed]

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
Closed 2 years ago.
I am having problems connecting the camera module to my Raspberry Pi 4. The Pi works just fine on its own, but as soon as I attach the camera module, it just doesn't boot.
What might be causing this?
So you have a successfully booting system, but after physically connecting the camera it will not boot?
First, double-check that the camera is connected properly, meaning the blue sides of the ribbon-cable connectors face the right way (i.e. the blue side faces the USB ports on the RPi itself, and faces the front of the camera on the camera module's connector). A quick search found this post containing pictures; that is usually the issue. If that fails, consider the options within the config.txt file on the /boot partition. Reference for config.txt.
One of the config options that gets added automatically when you enable the camera interface via raspi-config is start_x=1. The camera entries within config.txt are described here. Also be sure you have enough GPU memory configured (e.g. gpu_mem=128; increasing that is probably a good idea if you're doing a lot with the GPU, such as motion detection). But the physical connection is the most likely culprit.
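For reference, after enabling the camera via raspi-config, the relevant /boot/config.txt entries typically look something like this (exact values may vary by setup and OS version):

```ini
# /boot/config.txt — camera-related entries (legacy camera stack)
# start_x=1 loads the extended GPU firmware required by the camera
start_x=1
# gpu_mem sets GPU memory in MB; 128 is a sensible minimum for camera use
gpu_mem=128
```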

Testing a JavaFX App for Different Screen Resolutions [closed]

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
Closed 4 years ago.
This might seem a bit off-topic, but I have a JavaFX app whose scaling I need to test at different screen resolutions. My problem is that my Windows 8.1 PC runs at a maximum resolution of 1366x768, but I need to test at 1920x1080, 1440x900, 1600x900, etc. Any idea on how to achieve this would be appreciated.
The obvious solution would be to buy a new graphics card and/or monitor.
But if you want to avoid that, you can try testing your app in a virtual machine. In VirtualBox, if you set the virtual machine's resolution higher than the host's native resolution, you just get scroll bars around the set resolution, so all the scaling should behave as if you were running on a higher native resolution.
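If the guest doesn't offer the mode you need out of the box, VirtualBox also lets you register a custom video mode from the host command line. A minimal sketch, assuming a VM named "JavaFXTest" (substitute your own VM name):

```
# Register a 1920x1080 (32-bit) custom video mode for the guest to select
VBoxManage setextradata "JavaFXTest" "CustomVideoMode1" "1920x1080x32"
```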

Windows 8 screen doesn't fit monitor [closed]

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
Closed 3 years ago.
I've just installed Windows 8.1 (my friend tells me the PC will start and shut down faster than with Windows 7), and I've run into some problems: the screen doesn't fit correctly; there are black bars at the top and bottom of the screen.
My monitor is an LG E2211. I tried using the buttons on the monitor, but I can't change the original ratio, and it says "Digital input No access" when I choose auto.
I found this topic describing a similar problem, but it's only for Windows 7:
http://www.tomshardware.com/forum/286677-33-black-bars-5850?
Also, all games and videos have become a lot slower. My friend told me it's because the PC didn't recognize the graphics card. Is that correct?
Go to your graphics card manufacturer's official website and download the latest driver. I had the same problem with mine.
Sometimes the default driver fails to register the correct resolution, or places the image off screen.

Object Recognition Programmatically? [closed]

Closed. This question is off-topic. It is not currently accepting answers.
Closed 10 years ago.
Inspired by a recent Kickstarter campaign: http://www.kickstarter.com/projects/dominikmazur/camfind-a-mobile-visual-search-app?ref=category
The app uses the phone's camera to take a picture and identify virtually any object: snapping a photo of a movie poster will recognize the movie and pull up web results about it, and taking a picture of a product will show you websites where that product is for sale.
My question is: is this realistic? I find it very intriguing, but is object detection really that simple? I'd be interested in feedback and in resources to help someone get started learning about this topic.
Computer vision and pattern recognition are not easy at all; they form an entire field of artificial intelligence. The high-level idea, however, is relatively straightforward to understand. There is no way they are doing all of this on the client: phones just aren't fast enough, and don't have anywhere near enough storage space.
What they are most likely doing is sending the image to their servers, then using some kind of approximate nearest-neighbour search, running the result through a decision-tree lookup in a massive database of images that each have some hash. This finds a close match to an image they already have (assuming they have a lot of images in their database), even if only part of the image matches. Then, using the hash, they look up other information about that image to send back to the device.
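As a toy illustration of the hashing idea (not necessarily their actual pipeline), a simple perceptual "average hash" can be computed in a few lines of Python using the Pillow library; images whose hashes differ by only a few bits are near-duplicates, which is what makes a database lookup like this feasible:

```python
from PIL import Image  # pip install Pillow

def average_hash(path, size=8):
    """Shrink the image to a size x size grayscale thumbnail, then emit one
    bit per pixel: 1 if the pixel is brighter than the mean, else 0."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(h1, h2):
    """Number of differing bits; a small distance means visually similar."""
    return bin(h1 ^ h2).count("1")

# Hypothetical usage: a server would precompute hashes for millions of images
# and return metadata for the closest stored hash.
# print(hamming_distance(average_hash("query.jpg"), average_hash("poster.jpg")))
```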
Hope that helps!