I don't have any ARM devices like the Surface RT available in my country right now. Is there any way to test my app on an ARM simulator from Visual Studio?
As mentioned in the comment above, there is no ARM simulator available.
That said, an ARM device functions pretty much like an x86 PC. You just have to test all the resolutions and pixel densities available in the simulator.
And beware of the fact that a Surface RT can be more than 10 times slower than a typical PC. So if a computation takes half a second on your PC, it may take far too long on a Surface.
Hardware acceleration is a feature supported by Direct2D. Here is my question.
As far as I know, hardware acceleration is limited by the GPU model, driver version, etc. Does anybody know the details of this? In other words, how can I determine whether a computer supports Direct2D hardware acceleration?
The image below was captured in the Chrome browser.
I think this entirely depends on whether your hardware is capable of Direct3D 10/11. If you're able to create an ID3D10Device/ID3D11Device while explicitly specifying hardware mode, then Direct2D should also work in hardware. Note that it's a bit more complicated on the Direct2D side, because some render target types do not work with hardware mode, and you can also set render target options to explicitly ask for hardware mode, using D2D1_RENDER_TARGET_TYPE_HARDWARE.
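As a rough check, something along these lines should work with Direct3D 11; it's only a sketch (error handling trimmed, feature levels left at the defaults):

    #include <d3d11.h>
    #pragma comment(lib, "d3d11.lib")

    // If a hardware D3D11 device can be created, Direct2D hardware
    // acceleration should be available too.
    bool SupportsHardwareAcceleration()
    {
        ID3D11Device* device = nullptr;
        D3D_FEATURE_LEVEL level;
        HRESULT hr = D3D11CreateDevice(
            nullptr,                  // default adapter
            D3D_DRIVER_TYPE_HARDWARE, // explicitly ask for hardware
            nullptr, 0,
            nullptr, 0,               // default feature levels
            D3D11_SDK_VERSION,
            &device, &level, nullptr);

        if (SUCCEEDED(hr) && device)
        {
            device->Release();
            return true;
        }
        return false;
    }

    // On the Direct2D side you can then pass D2D1_RENDER_TARGET_TYPE_HARDWARE
    // in the render target properties, or keep D2D1_RENDER_TARGET_TYPE_DEFAULT
    // to let it fall back to software automatically.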
You can check the DirectX capabilities of any Windows-based PC by running DXDiag.exe and clicking on the "Display" tab.
In 2018 you would be hard-pressed to find a PC that didn't support hardware acceleration. Most systems these days support DirectX 11 at the least and this has more or less been the case since Windows 7.
Unless you're targeting Windows XP or Vista this simply isn't an issue.
For a milling machine I need to connect magnetic linear sensors which output a quadrature signal, in order to read the position of 4 axes. The professional digital readouts are rather expensive. After some searching I tried to use an Arduino board and Yuri's digital readout Android software, but the Bluetooth connection between my tablet and the Arduino kept failing.
I've since settled on a four-axis serial-to-USB box, sold by a company which is involved in the precision measurement industry. Now my issue is that the software supplied by the vendor for the converter box is only offered for Microsoft Windows. I'd like to run a Raspberry Pi 3 as my readout instead of dedicating a laptop.
I'm reading that running x86 on ARM is possible via QEMU or a paid software option. I'd rather not have to run WINE or a full Windows installation. I think it's possible to make this work in Linux on a Pi3 with a little bit of code. Where I think I will have the most trouble is that the Windows software requires some manner of special code to 'authenticate' the USB box.
Internally the box has a PIC18F45K22 8-bit RISC chip and an MC74HC86A XOR chip. I suspect the latter is used to combine the signals of the four axes before output. The USB-to-serial chip is a common FTDI FT232RL, which I can see connects as ttyUSB0. Running 'screen' against that device has produced no output.
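If it matters, I'm also happy to poke at the port from code; something like this is what I had in mind (the 9600 baud rate is just a guess on my part, and I would try other rates as well):

    // Minimal sketch for listening on the converter box from Linux.
    // The baud rate is a guess; the box may also stay silent until it
    // receives whatever initialization the Windows software sends.
    #include <fcntl.h>
    #include <termios.h>
    #include <unistd.h>
    #include <cstdio>

    int main()
    {
        int fd = open("/dev/ttyUSB0", O_RDWR | O_NOCTTY | O_NONBLOCK);
        if (fd < 0) { std::perror("open"); return 1; }

        termios tio{};
        tcgetattr(fd, &tio);
        cfmakeraw(&tio);
        cfsetispeed(&tio, B9600);       // also try B19200, B115200, ...
        cfsetospeed(&tio, B9600);
        tcsetattr(fd, TCSANOW, &tio);

        unsigned char buf[64];
        for (int i = 0; i < 100; ++i)   // listen for roughly 10 seconds
        {
            ssize_t n = read(fd, buf, sizeof(buf));
            for (ssize_t j = 0; j < n; ++j)
                std::printf("%02X ", buf[j]);
            std::fflush(stdout);
            usleep(100000);
        }
        close(fd);
        return 0;
    }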
The specification of the microcontroller indicates that it is re-programmable with 1024 bytes of EEPROM. Among the other code they've flashed to the chip, would they have programmed in the 'authentication' code mentioned earlier? Short of de-compiling the Windows program, can I interrogate the device across USB without ruining it?
The microcontroller manufacturer's site seems to have reasonable documentation and even code samples. Assuming the flashed code isn't obfuscated, could I download the contents of the EEPROM? From there I suspect I could see what commands are required to initialize the box. I could either remove the 'authentication' or rewrite the program in its entirety and flash it.
I am putting together a very simple device for developing C++ applications on the go. Something like a PSP, size-wise, with a small keyboard to write code and an LCD screen for the output. The device also has a DVI out, so I can output to a screen too.
Now, I am looking for a very lightweight C++ IDE for X11 (the device runs on a Linux variant of OpenEmbedded); something that works decently on the DVI monitor but mainly on the small LCD screen (a 4.3-inch screen with a 480x272 resolution).
I am aware that anything under 1024x768 is just pure joy for pain lovers, but the device will only be used in particular situations, when a laptop/netbook is not feasible.
While using the DVI out, I have no problems running GNOME; the screen is big enough to allow me to work comfortably. The problem is that it won't scale very well on the 480x272 screen.
The first idea was to use nano and g++, without even loading X11, but I need something with code completion and a minimal UI for the standard operations (build, run, step-by-step debugging, breakpoints), if possible. The memory is not that big (256 MB), so the smaller the IDE, the better.
What would you suggest, other than the text editor and g++ approach? I am new to the embedded Linux world. I use Eclipse on Ubuntu, but that one is incredibly huge and would just kill such a small device. On Windows I would use Dev-C++, while on Mac I use either Code::Blocks or Xcode, but I don't really need all of their features on this device.
Thanks in advance
I had the same problem, developing on an ODroid (http://hardkernel.com) running Linux on an ARM processor. For a while I used NetBeans, but it was too heavy, so I finally gave up and wrote my own open-source IDE for the purpose.
https://github.com/amirgeva/coide
I tried using "Kinect for Windows" on my Mac. The environment set-up seems to have gone well, but something seems to be wrong. When I start some samples such as
OpenNI-Bin-Dev-MacOSX-v1.5.4.0/Samples/Bin/x64-Release/Sample-NiSimpleViewer
or others, the sample application starts and seems to work quite well at the beginning, but after a few seconds (10 to 20 seconds) the movement shown in the application's window halts and never resumes. It seems that the application becomes unable to fetch data from the Kinect after a certain point.
I don't know whether the problem is in the libraries, their dependencies, or the Kinect hardware itself (perhaps it is broken in some way that isn't visible), and I really want to know how to determine which it is.
Could anybody tell me how I can fix this issue?
My environment is shown below:
Mac OS X v10.7.4 (MacBook Air, Core i5 1.6 GHz, 4 GB of memory)
Xcode 4.4.1
Kinect for Windows
OpenNI-Bin-Dev-MacOSX-v1.5.4.0
Sensor-Bin-MacOSX-v5.1.2.1
I followed instruction here about libusb: http://openkinect.org/wiki/Getting_Started#Homebrew
and when I try using libfreenect (I know it's separate from OpenNI+SensorKinect), its sample applications say "Number of devices found: 0", which makes no sense to me since I have certainly connected my Kinect to my MacBook Air...
Unless you're booting into Windows, forget about Kinect for Windows.
Regarding libfreenect and OpenNI: in most cases you'll use one or the other, so think about which functionality you need.
If it's basic RGB + depth image access (and possibly motor and accelerometer control), libfreenect is your choice (see the sketch at the end of this answer).
If you need the RGB + depth images plus skeleton tracking and (hand) gestures (but no motor or accelerometer access), use OpenNI. Note that if you use the unstable (dev) versions, you should use Avin's SensorKinect driver.
The easiest thing to do is a nice clean install of OpenNI.
Also, if it helps, you can use a creative coding framework like Processing or OpenFrameworks.
For Processing I recommend SimpleOpenNI.
For OpenFrameworks you can use ofxKinect, which ties into libfreenect, or ofxOpenNI. Download the OpenFrameworks package on the FutureTheatre Kinect Workshop wiki, as it includes both addons and some really nice examples.
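If you do go the libfreenect route, a bare-bones device-count check looks roughly like this. It's only a sketch; the header path and the -lfreenect link flag depend on how libfreenect was installed (with Homebrew the header usually ends up under libfreenect/):

    // Minimal libfreenect check (compile with: g++ check.cpp -lfreenect).
    #include <libfreenect/libfreenect.h>
    #include <cstdio>

    int main()
    {
        freenect_context* ctx = nullptr;
        if (freenect_init(&ctx, nullptr) < 0)
        {
            std::printf("freenect_init failed\n");
            return 1;
        }

        // Ask libusb (via libfreenect) how many Kinects it can see.
        std::printf("Number of devices found: %d\n", freenect_num_devices(ctx));

        freenect_shutdown(ctx);
        return 0;
    }

If this also reports 0 devices, the problem is at the USB/power level rather than in OpenNI or your own code.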
When you connect the Kinect device to the machine, have you provided external power to it? The device will appear connected to a computer on USB power alone, but it will not be able to transfer data, as it needs the external power supply.
Also, which Kinect sensor are you using? If it is a new Kinect device (designed for Windows) it may have a different device signature, which may cause the OpenNI drivers to play up. I'm not 100% sure on this one, but I've only ever tried OpenNI with an Xbox 360 sensor.
All,
If I were to develop a kiosk app using Windows Presentation Foundation, C#, and .NET, what hardware would I need? I plan on making it a standalone desktop app. It would contain images and about 1-2 minutes of video. What kind of CPU (Pentium, dual-core, what clock speed?), graphics card, and memory?
What if I made the kiosk a web app? What hardware requirements would I need?
Thanks,
Rohit
Not the exact answer, but I would go with the cheapest current PC configuration. Your software will probably run on a $300 budget PC.
If you need more proof and want to go with an older PC, test it on a relative's PC; I'm sure you could find someone with a P4. ;-) Or just buy one used (probably for $50).