Does IronPython support APIs to receive/send data from peripherals? - api

I will be receiving a piece of hardware that comes with a native Python API for communicating with it over USB. The device will send image data to Python, because the vendor only supports a Python API.
I'm stuck because I need to use C#, since it is part of the already-developed application. So I was thinking of handling that hardware's USB data communication in .NET, but I was quite shocked to hear that they only have a Python API. Then I came across IronPython. However, I am not sure it would work in my case, since I'm talking about a third-party API from a small company.
Can I embed this in a C# application and communicate through it? In my case the Python engine would receive data from the USB port and transfer it to C#, or data would go from C# to Python and then on to the device. Is this possible? If not, do any alternatives exist?
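For what it's worth, one common pattern is to keep a thin pure-Python wrapper around the vendor's API and load it from C# through IronPython's hosting API (Python.CreateEngine(), engine.ExecuteFile(...), scope.GetVariable(...)). Note that IronPython generally cannot load CPython C-extension modules, so this only works if the vendor's package is pure Python. A minimal sketch of such a wrapper, where vendor_sdk and its functions are placeholders for whatever the vendor actually ships:

    # usb_bridge.py -- hypothetical wrapper around the vendor's Python API.
    # vendor_sdk, open_device and read_image are placeholder names.
    import vendor_sdk

    _device = None

    def connect():
        # Open the USB device once; the C# host calls this through IronPython.
        global _device
        _device = vendor_sdk.open_device()

    def read_image():
        # Return one image frame as raw bytes so C# can consume it as byte[].
        return bytes(_device.read_image())

    def send(data):
        # Forward bytes handed over from C# down to the device.
        _device.write(bytes(data))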

Related

How to generate HID get/set report requests to test a new USB device

I want to test my USB device firmware. I'm looking for a host tool to generate get/set reports to exercise my USB device. Handling INPUT requests and generating OUTPUT requests would be a bonus.
Are there any generic tools to send a USB message? Any python tools to recommend?
It seems like a pretty generic need, but I haven't found much in my web searches.
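One low-level Python option is pyusb, which can issue the HID class-specific Get_Report/Set_Report control transfers directly. A sketch (not a polished tool), where the vendor/product IDs, report ID and report length are placeholders for your device:

    # Requires pyusb (pip install pyusb) with a libusb backend installed.
    import usb.core

    VENDOR_ID, PRODUCT_ID = 0x1234, 0x5678   # replace with your device's IDs
    REPORT_ID = 0
    REPORT_LEN = 8

    dev = usb.core.find(idVendor=VENDOR_ID, idProduct=PRODUCT_ID)
    if dev is None:
        raise SystemExit("device not found")

    # Detach the OS HID driver so we can talk to interface 0 directly (Linux).
    if dev.is_kernel_driver_active(0):
        dev.detach_kernel_driver(0)

    # Set_Report (OUTPUT report): bmRequestType 0x21 = host-to-device, class, interface;
    # bRequest 0x09; wValue = (report type 0x02 << 8) | report ID; wIndex = interface.
    dev.ctrl_transfer(0x21, 0x09, (0x02 << 8) | REPORT_ID, 0, bytes(REPORT_LEN))

    # Get_Report (INPUT report): bmRequestType 0xA1 = device-to-host, class, interface;
    # bRequest 0x01; wValue = (report type 0x01 << 8) | report ID.
    data = dev.ctrl_transfer(0xA1, 0x01, (0x01 << 8) | REPORT_ID, 0, REPORT_LEN)
    print(list(data))

Unsolicited INPUT reports sent on the interrupt IN endpoint can be read the same way with dev.read() on that endpoint.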

Writing an API to Interact with my own custom hardware?

Suppose I have a custom-made piece of hardware connected to the computer, and I have drivers installed to communicate with it. How can I write an API through which I can access the hardware programmatically (as an example, consider accessing the Oculus Rift head-mounted device using the Oculus SDK)? Specifically, how do I make my API communicate with the device drivers to access the hardware using the system call interface? If possible, explain with an example.
Your hardware driver should implement a way of interacting with user space.
For example, it can expose special file(s), which the user can open (using the standard open(2)) and then call read/write/mmap/ioctl on. Each such operation triggers some driver code, which in turn triggers some hardware request.
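For illustration, this is roughly what that looks like from user space, sketched in Python; the device node path and the ioctl request code are made up and depend entirely on how the driver defines its interface:

    import fcntl
    import os
    import struct

    MY_IOCTL_GET_STATUS = 0x80045500   # hypothetical request code from the driver's header

    fd = os.open("/dev/mydevice0", os.O_RDWR)    # hypothetical device node
    try:
        os.write(fd, b"\x01\x02")                # runs the driver's write handler
        data = os.read(fd, 64)                   # runs the driver's read handler
        buf = bytearray(4)
        fcntl.ioctl(fd, MY_IOCTL_GET_STATUS, buf)  # runs the driver's ioctl handler
        (status,) = struct.unpack("I", buf)
        print(status, data)
    finally:
        os.close(fd)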

Is it possible to use WebRTC to stream video from server to client?

In WebRTC, I always see implementations of peer-to-peer connections and how to get video streaming from one client to another client. How about server-to-client?
Is it possible for WebRTC to stream a video file from server to client?
(I am thinking about using the WebRTC native C++ API to create my own server application that connects to the current client implementations in the Chrome or Firefox browser.)
OK, if it is possible, will it be faster than many current video streaming services?
Yes, it is possible, as the server can be one of the peers in that peer-to-peer session.
If you respect the protocols and send the video in SRTP packets using VP8, the browser will play it. To help you build these components on other applications or servers, you can check this page and this project as a guide.
Now, comparing WebRTC with other streaming services... It will depend on several variables like the Codec or the protocol. But, for instance, comparing WebRTC (SRTP over UDP with VP8 Codec) against Flash (RTMP over TCP with H264 Codec), I would say that WebRTC wins.
The player will be Flash Player against the native <video> tag.
The transport would be TCP against UDP.
But of course, everything depends on what you are sending to the client.
I have written some apps and plugins using the native WebRTC API, and there isn't a lot of information out there yet, but here are a few useful resources to get you started:
QT Example: http://research.edm.uhasselt.be/jori/qtwebrtc
Native to Browser example: http://sourcey.com/webrtc-native-to-browser-video-streaming-example/
I started with the WebRTC Native C++ to Browser Video Streaming Example, but it no longer builds against the current WebRTC native code.
Then I made modifications, merging the following into a standalone process:
management of the peerConnection (the peerconnection_server)
access to Video4Linux capture (the peerconnection_client).
Removing the stream from the browser to the WebRTC native C++ client gives a simple way to reach a Video4Linux device from a WebRTC browser; the result is available on GitHub as webrtc-streamer.
Live Demo
We are attempting to replace MJPEGs with WebRTC for our server software and have a prototype module for doing this using a smattering of components tied to the OpenWebRTC project. It has been an absolute bear to do, and we have frequent ICE negotiation errors (even over a simple LAN), but it mostly works.
We also built a prototype with the Google WebRTC module, but it had many dependencies. I find it easier to work with the OpenWebRTC modules because Google's stuff is so tightly tied to general peer-to-peer scenarios in the browser.
I compiled the following from scratch:
libnice 0.1.14
gstreamer-sctp-1.0
usrsctp
Then I have to interact with libnice a bit directly to gather candidates, and I also have to write out the SDP files by hand. But the amount of control--being able to control the source of the pipeline--makes it worthwhile. The resulting pipeline runs two clients off one server source.
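To give a feel for the "SDP by hand" step, here is a rough sketch (all values are placeholders) that formats a minimal sendonly VP8 offer from the ICE credentials and candidates gathered via libnice:

    def build_sdp(ice_ufrag, ice_pwd, dtls_fingerprint, candidates):
        # Minimal single-m-line SDP for one sendonly VP8 stream (payload type 96).
        lines = [
            "v=0",
            "o=- 0 0 IN IP4 127.0.0.1",
            "s=-",
            "t=0 0",
            "m=video 9 UDP/TLS/RTP/SAVPF 96",
            "c=IN IP4 0.0.0.0",
            "a=ice-ufrag:" + ice_ufrag,
            "a=ice-pwd:" + ice_pwd,
            "a=fingerprint:sha-256 " + dtls_fingerprint,
            "a=setup:actpass",
            "a=mid:video0",
            "a=sendonly",
            "a=rtcp-mux",
            "a=rtpmap:96 VP8/90000",
        ]
        lines += ["a=candidate:" + c for c in candidates]   # as reported by libnice
        return "\r\n".join(lines) + "\r\n"

    print(build_sdp("someufrag", "somepasswordatleast22chars", "AA:BB:CC:DD",
                    ["1 1 UDP 2013266431 192.168.1.10 50000 typ host"]))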
Of course. I'm writing a program using the native WebRTC API which can join the conference as a peer and record both video and audio.
see: How to stream audio from browser to WebRTC native C++ application
And you can definitely stream media from a native app.
I'm sure you can use dummy_audio_file to stream audio from a local file, and you can find a way to handle the video stream on your own.
Yes, it is. We have developed a load test tool to publish and play streams for Ant Media Server. This tool can broadcast a media file. We used the same native WebRTC library that is used in Ant Media Server.
Sure, it's possible; it allows converting live streaming to WebRTC, for example:
OBS/FFmpeg ---RTMP---> Server ---WebRTC--> Chrome/Client
For this scenario, it allows ultra-low-latency live streaming, about 600~800 ms, when the live stream is played over WebRTC. Please take a look at this demo.

VB.NET Windows application barcode scanner

I am creating a Windows application which scans a barcode and identifies the product. For this, I am planning to use a webcam temporarily. I want to know how I can interface my webcam with the application. The webcam is a USB one.
The application is similar to the ones available these days on smartphones.
Please help.
Thank You!
You can use either the WIA approach or DirectShow, depending on your webcam driver. In both cases, unless you are very familiar with these two approaches, you might want to use a third-party library. We are using a third-party multimedia toolkit named leadtools to control our web cameras. You can check this forum post for further information.
Also, this toolkit supports reading barcodes. For more information, refer to this Tutorial.
You need to make use of Windows Image Acquisition (WIA).
Here are several resources which you can use.
How to use a web cam in C# with .NET Framework 4.0 and Microsoft Expression Encoder 4
C# WebCam User Control Source
How do I connect to a USB webcam in .NET?
or DirectShow.Net
Webcam using DirectShow.NET

Programming USB in embedded system for sending some data to host for printing

I have been tasked with writing a USB driver for our embedded software to send raw data to the host. This will be used to send some logging data to the host. We are using the iMX31 litekit for development.
From the documents that I have read on USB, my understanding is that the embedded device will be in device mode only. Also, it will only be communicating with the host machine.
So can anyone guide me here? Any article, reference, or code is welcome.
Some things to consider:
Is this a high bandwidth device like a camera or data recorder, or a low bandwidth device?
For low bandwidth, I would strongly consider making your device act as a USB HID class. This is the device class that supports keyboards, mice, joysticks, gamepads, and the like. It is relatively easy to send data to nearly any application, and it generally doesn't require that you write a custom device driver on the host side. That latter feature alone is often worth the cost of lightly contorting your data into the shape assumed by the HID class. All the desktop operating systems that do USB can use HID devices, so you get broad compatibility fairly easily.
For high bandwidth, you would still be better served if your device fits one of the well established device classes, where a stock device driver on the host end of the wire can be used. One approach that often works is to use the Mass Storage class, and emulate a disk drive containing one file. Then, your device simply mounts on the host as if it were a disk, and you communicate by reading and writing to one (or a few) file.
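As a sketch of what the mass-storage approach looks like from the host side, where the mount point and exchange file name are placeholders for whatever your device exposes:

    EXCHANGE_FILE = "E:/exchange.bin"    # e.g. "/media/usbdisk/exchange.bin" on Linux

    # "Send" a command by writing a record into the shared file.
    with open(EXCHANGE_FILE, "r+b") as f:
        f.write(b"CMD:DUMP_LOG\n")

    # Later, after the device has updated the file, read the data back.
    with open(EXCHANGE_FILE, "rb") as f:
        response = f.read()
    print(len(response), "bytes read back")

In practice the host OS caches the filesystem, so real designs have to force a re-read (remount or direct I/O), but the basic exchange really is just file I/O.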
I would expect there to be a fair amount of sample code out there for any serious USB device chipset that implements either or both of HID and Mass Storage.
If you really must wander into fully custom device territory, then you will need to be building device drivers for each host platform. The open source libusb library can be of some help, if its license is compatible with your project. There are also ways in newer versions of Windows to develop USB drivers that run in user mode using the User Mode Driver Framework that have many of the same advantages of libusb, but are not portable off the Windows platform.
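If you do go the fully custom route, the host side can stay fairly small on top of libusb; here is a sketch using its Python binding pyusb, where the IDs and endpoint addresses are placeholders taken from your device's descriptors:

    import usb.core

    dev = usb.core.find(idVendor=0x1234, idProduct=0x5678)   # your VID/PID here
    if dev is None:
        raise SystemExit("device not attached")
    dev.set_configuration()

    EP_OUT, EP_IN = 0x01, 0x81            # bulk OUT and bulk IN endpoint addresses

    dev.write(EP_OUT, b"hello device")    # raw bulk transfer to the device
    log_chunk = dev.read(EP_IN, 512, timeout=1000)   # read raw logging data back
    print(bytes(log_chunk))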
The last custom device I worked on was based on a Cypress device, and we were able to ship their driver and an associated DLL to make our application code easier to build. I don't know off the cuff if there is any equivalent available for your device.
For a really good overview, I recommend the USB FAQ, and the latest edition of Jan's book, USB Complete.