Real-time communication between 2 files - VBA

We have "host.xlsb" and "checkin.xlsb" on PC#1.
PC#2 opens "checkin" over the LAN.
During business hours, clients come in and scan the barcode on their membership ID card with a barcode scanner.
The barcode scanner reads the ID and sends it to "checkin".
"checkin" validates the ID, displays info (e.g. which table) to the client, and records the ID and check-in time to a list.
"host" is for reception; it pulls data from "checkin" to see who has arrived and who has not, and to check whether clients went to the wrong table.
So I want "host" to be able to read changes on "checkin" in real time. Is that possible?
P.S.:
I know I could do this by simply putting "host" and "checkin" in a single workbook and using PC#1 only.
But if I combine them, reception would have to wait for clients, or clients would have to wait for reception.
Nor do I want any other PC to open the combined workbook at the same time.

Make the application on the checkin machine keep the data in a separate (ASCII) file. "checkin" would open the file with write access and update the information. The "host" machine would open the file with read access, check the latest info, then close the file.
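A minimal sketch of that pattern in Python (standing in for the VBA): "checkin" appends one line per scan to a shared log, and "host" re-reads the file whenever it wants the latest state. The record fields and log layout here are assumptions, not part of the original setup.

```python
import csv
import io

# Hypothetical record format for the shared check-in log:
# one line per scan: member_id, check_in_time, table
CHECKIN_LOG_FIELDS = ["member_id", "check_in_time", "table"]

def append_checkin(f, member_id, check_in_time, table):
    """Checkin side: append one record and flush so "host" sees it promptly."""
    writer = csv.writer(f)
    writer.writerow([member_id, check_in_time, table])
    f.flush()

def read_checkins(f):
    """Host side: re-read the whole log with read access and return records."""
    f.seek(0)
    return [dict(zip(CHECKIN_LOG_FIELDS, row)) for row in csv.reader(f) if row]

# In the real setup, "checkin" keeps the file open for writing on PC#1 while
# "host" repeatedly opens it read-only over the LAN share, then closes it.
```

The key point is that only one side ever writes, so there is no locking conflict: the reader can poll as often as it likes.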


WebRTC: Have multiple tracks (or streams) and identify them on the other side

I'm using WebRTC to build a Skype-like application. I want one party to be able to send a feed from their webcam while sharing their screen at the same time.
On the receiving end, however, I can't find any way to identify what type of stream I'm receiving -- the label and ID are reset to new values (bummer, I was hoping to identify a stream by its source ID), and I can't find any option for adding my own metadata to the streams or tracks. How does the receiving client know what type of media I'm sending it?
Any ideas? Thanks in advance!
As it turns out, MediaStreamTracks get a new ID assigned on the other side. MediaStreams, however, keep their assigned IDs, so use those when calling addTrack, and then use a DataChannel to send information about each stream based on its ID.
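A sketch of the metadata exchange that answer describes, in Python standing in for the browser-side JavaScript. The JSON message shape and field names (`streamId`, `kind`) are assumptions; only the idea of keying metadata by the MediaStream's preserved ID comes from the answer.

```python
import json

def describe_stream(stream_id, kind):
    """Sender side: build a metadata message keyed by the MediaStream's id,
    to be sent over a DataChannel alongside the tracks."""
    return json.dumps({"streamId": stream_id, "kind": kind})

def register_stream(metadata_by_id, message):
    """Receiver side: on each DataChannel message, remember what kind of
    media the stream with that id carries."""
    info = json.loads(message)
    metadata_by_id[info["streamId"]] = info["kind"]

# When the track event fires on the receiver, look up the id of the stream
# the track belongs to in metadata_by_id to learn whether it is the webcam
# feed or the screen share.
```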

How do I clear up Documentum rendition queue?

We have around 300k items in dmi_queue_item.
If I right-click and select "destroy queue item", the row no longer appears when I query by r_object_id.
Would that mean the file will no longer be processed by the CTS service? I need to know whether this is the way to clear the queue for the rendition process (converting to PDF), or what the best way to clear the queue would be.
Also, for some items/rows I get an error message when doing the right-click "destroy". What does it mean, and how can I avoid it? I'm not sure whether the item was already processed and the row no longer exists, or whether it's something else.
The dmi_queue_item table is used as a queue for all sorts of events at the Content Server.
Content Transformation Services uses it to read at least two types of events, AFAIK.
According to the Content Transformation Services Administration Guide, version 7.1, page 18, it reads dm_register_assets events and performs the configured content actions for those specific objects.
I was using CTS to generate content renditions for some objects via the dm_transcode_content event.
However, be careful when cleaning up dmi_queue_item, since there can be many different event types in it. It is up to system administrators to keep this queue clean by configuring system components to use events, or not to queue up events that are not supposed to be used.
As for cleaning the queue, it is advised to use the destroy API command, though you can also try deleting rows with a DELETE query. Of course, try this in a dev environment first.
You would need to look at 2 queues:
dm_autorender_win31 and dm_mediaserver. To delete them, you would run this query:
delete dmi_queue_item objects where name = 'dm_mediaserver' or name = 'dm_autorender_win31'

NServiceBus Central Repository

I am currently researching the possibility of using NServiceBus for one of our applications. The current application takes large text files and parses the details into a database. The users perform various operations on the file content, approve the changes, and finally release the updated file. When the file is released, various other services need to do something with that file data (drop file in ftp folder, email customer, bill customer).
From what I've read, services should be autonomous and should not share data except via messages. I like that concept; however, in my case I wonder whether it is practical. Some of these files can contain up to a million records.
So my question is, should each service (operations, billing, emailer) all have their own database and table for storing this file data, and move the data via the DataBus? Or should I be more pragmatic and only send the fileID in the message which references a central file table?
Thanks for any guidance you can offer.
There are a couple of things that one should not do with a service bus:
move masses of data
perform queries
large ETL operations
You can certainly do all these things, but you will probably be left disappointed. Using messaging to enable some of these operations is fine, though. Your idea of sending the FileID is definitely the way to go.
As an example: I previously implemented an e-mail sending service. The service can send attachments, but these can be large. So instead of including the attachments in the messages, I stored them in a shared folder and sent a SendEMailCommand message that included the unique IDs of the attachments to send with the e-mail. The e-mail service would then pick up the attachments from the shared folder. After the service successfully sent the mail, it would publish an EMailSentEvent message.
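The attachment example can be sketched as follows (Python pseudocode for what would be C#/NServiceBus in practice). The shared-folder path, field names, and helper functions are all hypothetical; the point is that the message carries only IDs, never the payload bytes.

```python
import os

SHARED_FOLDER = "/shares/email-attachments"  # hypothetical shared folder

def build_send_email_command(to, subject, attachment_ids):
    """Sender side: the (hypothetical) SendEMailCommand payload carries
    only the unique attachment ids, keeping the message small."""
    return {"to": to, "subject": subject, "attachmentIds": list(attachment_ids)}

def resolve_attachments(command):
    """E-mail service side: map each id back to a file in the shared folder
    before actually sending the mail."""
    return [os.path.join(SHARED_FOLDER, att_id)
            for att_id in command["attachmentIds"]]
```

The same shape applies to the asker's FileID idea: the released file lives in one agreed-upon place, and each service (operations, billing, emailer) dereferences the ID itself.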

Alternative to WCF callback. Is there one?

I'm wondering if there is a better option than a WCF callback.
When processing and printing invoices, I need to constantly show the user in a WinForm: "Invoice 1 printed", "Invoice 2 printed", etc.
I have put together a callback mechanism and it all works, but I'm wondering if there is a better way of doing this.
I was thinking along the lines of whether 2 services would be better than a callback:
one that loops server-side through the invoices and saves status = "Printed" to the database, and another that queries it, checks whether each invoice has printed, and returns the result to the user.
Would that be better than a callback: faster, avoiding timeouts, etc.?
I'm just considering alternatives, as a colleague who used callbacks extensively said "don't use callbacks, use 2 services".
What would you do if you had to process 2000 invoices and notify the user for each one?
Any suggestions?
On one project we did the following:
All Windows clients also host a WCF service.
When the Windows client starts, it registers itself with the server, so the server knows this user is logged on at this IP address.
The server stores info on who is logged in where.
Then we can send a message to the user whenever we want.
When the client receives the message, we fire an event, and whatever part of the UI is affected can update itself or show a message.
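The registration-and-push pattern above can be sketched like this (Python standing in for the WCF plumbing; the class and method names are invented, and a plain callback stands in for the service each client hosts):

```python
class NotificationHub:
    """Sketch of the registration pattern: each client registers itself on
    startup, and the server pushes messages to whichever user it wants."""

    def __init__(self):
        self._clients = {}  # user -> callable taking a message

    def register(self, user, callback):
        # The real server stores the user plus the IP address of the WCF
        # service the client hosts; here a callback stands in for that.
        self._clients[user] = callback

    def notify(self, user, message):
        # Push a message only if that user is currently registered.
        callback = self._clients.get(user)
        if callback is not None:
            callback(message)

# Usage: the server calls notify("alice", "Invoice 1 printed") after each
# invoice, and the client's callback fires a UI event.
```

For the 2000-invoice case, this pushes one message per invoice as it prints, so the user sees progress without the client polling a status table.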

HID input report queues on C8051F320

It seems that as soon as data is ready for the host (such as when I use WriteFile to send a command telling the HID to return some data, like a port value) and the IN packet ready bit is set, the host reads it (as confirmed by another USB interrupt) before ReadFile is ever called. ReadFile is later used to put this data into a buffer on the host. Is this the way it should happen? I would have expected the ReadFile call to trigger the IN interrupt.
So here is my problem: I have a GUI and HID that work well together. The HID can do I2C to another IC, and the GUI can tell the HID to do I2C just fine. Upon startup, the GUI reads data from the HID and gets a correct value (say, 0x49). Opening a second GUI to the same HID does the same initial data read from the HID and gets the correct value (say, 0x49; it should be the same as the first GUI's read). Now, if I go to the first GUI, and do an I2C read, the readback value is 0x49, which was the value that the 2nd GUI had requested from the HID. It seems that the HID puts this value on the in endpoint for all devices attached to it. Thus the 1st GUI incorrectly thinks that this is the correct value.
Per Jan Axelson's HID FAQ, "every open handle to the HID has its own report queue. Every report a device sends goes into all of the queues so multiple applications can read the same report." I believe this is my problem. How do I purge the queue and clear the endpoint before the 1st GUI makes its request, so that the correct value (which the HID does send, per the debugger) gets through? I tried HidD_FlushQueue, but it keeps returning False ("handle is invalid" errors, although the handle is valid given that WriteFile/ReadFile succeed with the same handle). Any ideas?
Thanks!
You might not like this suggestion, but one option would be to allow only one GUI at a time to have an open handle. Use your favorite resource-allocation locking mechanism and make the GUIs ask for the HID resource before opening the handle and using it.
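One simple locking mechanism of that kind is an exclusive lock file, sketched here in Python (the lock-file path and function names are illustrative; any OS mutex would do equally well). The `O_EXCL` flag makes creation atomic, so only one GUI can win the race:

```python
import os

LOCK_PATH = "hid_device.lock"  # hypothetical lock file shared by the GUIs

def try_acquire_hid(lock_path=LOCK_PATH):
    """Try to become the one GUI allowed to open the HID handle.
    Returns True on success; O_CREAT | O_EXCL makes creation atomic,
    so a second caller fails while the first holds the lock."""
    try:
        fd = os.open(lock_path, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
        os.close(fd)
        return True
    except FileExistsError:
        return False

def release_hid(lock_path=LOCK_PATH):
    """Release the lock so another GUI may open the device."""
    try:
        os.remove(lock_path)
    except FileNotFoundError:
        pass
```

Each GUI calls `try_acquire_hid` before opening the device handle and `release_hid` when done; a GUI that gets False simply waits or tells the user the device is busy.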