At an electric company where I was hired temporarily, we have to implement an upgrade of the billing and payments system (the current system runs on dBASE III). The company's programmer and I have decided to use VB.NET and MySQL.
The company serves several towns and has billing and payment centers in some of them. Every billing period, the meter readers take the reading for every electric meter and write it on a sheet. At 5 pm each day, an employee from each center collects the sheets and travels to the main center, where the readings are encoded.
The bills are printed at the main center and then distributed to the branches.
During discussions with the General Manager and the heads of the company, the two of us were tasked with taking advantage of the internet, because the towns where the centers are located have internet connectivity, and for those that don't, we can use mobile internet.
The new system will allow users to enter the readings and send the data to the main server at the main branch. They will also be able to download and print the bills.
Our problem now is deciding what type of system to implement. Should it be web based, or a desktop application that connects to our database server through a VPN?
If this is a fixed-price project, and the client will accept either web or desktop, go with desktop over VPN. You'll save A TON of time, and have something that is more responsive (from a user perspective).
However, if you think the client will eventually need to use the product on mobile devices or the web, you're shooting yourself in the foot by going WinForms.
Having had some experience with using a thick client through a VPN, I'd say go with some kind of web app.
If done wrong, a thick client can become really painful to use through a VPN because of all the data churning back and forth over the link. A web app concentrates all of that on the server, which makes it much better from that point of view.
Other benefits:
no deployment hassle
no direct access to the database from the user machine.
Of course, it also depends on your skills and on how much time/budget you have...
I don't know the client's situation... but what about giving them the best of both worlds? Since it sounds like you will be programming on a Windows-based system and have deployment access to Windows Server hardware, why not either build a Silverlight application, or a WPF application hosted in an IE window?
I think the answer depends on the type and frequency of database queries you need to make. Querying a DB from a thick client through a VPN can be SLOOOOOWWWWWW. In a web app, the application logic runs close to the DB, maybe even on the same machine, so DB queries are fast. The downside is that the UI can be slower. But it is probably easier to design a responsive web-based UI than to make a VPN fast.
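To make the round-trip cost concrete, here is a rough sketch in the spirit of the poster's MySQL backend (written in C#; the connection string, table, and column names are invented). The point is the number of VPN round trips, not the exact API:

```csharp
// Rough sketch only: the connection string, table, and column names are invented.
// What matters is the number of round trips over the VPN, not the exact API.
using System;
using System.Collections.Generic;
using MySql.Data.MySqlClient;

class RoundTripDemo
{
    static void Main()
    {
        var connStr = "Server=main-branch;Database=billing;Uid=app;Pwd=secret;";
        var accountIds = new List<int> { 1001, 1002, 1003 /* ...hundreds more... */ };

        using (var conn = new MySqlConnection(connStr))
        {
            conn.Open();

            // Chatty thick-client pattern: one query per account = one round trip each.
            // At ~200 ms of VPN latency, 500 accounts is ~100 seconds of pure waiting.
            foreach (var id in accountIds)
            {
                using (var cmd = new MySqlCommand(
                    "SELECT kwh FROM meter_readings WHERE account_id = @id", conn))
                {
                    cmd.Parameters.AddWithValue("@id", id);
                    cmd.ExecuteScalar();
                }
            }

            // Server-side/batched pattern: one round trip for the whole billing period.
            using (var cmd = new MySqlCommand(
                "SELECT account_id, kwh FROM meter_readings WHERE billing_period = @p", conn))
            {
                cmd.Parameters.AddWithValue("@p", "2013-06");
                using (var reader = cmd.ExecuteReader())
                {
                    while (reader.Read()) { /* consume rows */ }
                }
            }
        }
    }
}
```

A web app gets the batched behaviour almost for free, because the chatty part happens between the web server and the database on the same LAN.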
What device will your bill collectors use?
1. A laptop with a mobile internet connection?
2. A specialized handheld device that records the reading and sends it to the service center?
If it is a laptop, you can create a website where only authorized people can log in and insert readings into the database. Use HTTPS for better security.
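As a rough illustration of that idea (not a full design), an ASP.NET MVC action protected by [Authorize] could write a reading into MySQL with a parameterized command. The ReadingModel class, "Encoder" role, connection string name, and table/column names below are all placeholders:

```csharp
// Rough illustration only, not a full design: an ASP.NET MVC action protected by
// [Authorize] that writes one reading into MySQL with a parameterized command.
// The ReadingModel class, "Encoder" role, connection string name, and table/column
// names are all placeholders. HTTPS should be enforced site-wide in IIS/web.config.
using System.Configuration;
using System.Web.Mvc;
using MySql.Data.MySqlClient;

public class ReadingModel
{
    public int AccountId { get; set; }
    public decimal KwhReading { get; set; }
}

[Authorize(Roles = "Encoder")]        // only logged-in encoders reach this controller
public class ReadingsController : Controller
{
    [HttpPost]
    [ValidateAntiForgeryToken]
    public ActionResult Submit(ReadingModel model)
    {
        var connStr = ConfigurationManager.ConnectionStrings["Billing"].ConnectionString;
        using (var conn = new MySqlConnection(connStr))
        using (var cmd = new MySqlCommand(
            "INSERT INTO meter_readings (account_id, kwh, read_at) VALUES (@acct, @kwh, NOW())",
            conn))
        {
            cmd.Parameters.AddWithValue("@acct", model.AccountId);
            cmd.Parameters.AddWithValue("@kwh", model.KwhReading);
            conn.Open();
            cmd.ExecuteNonQuery();
        }
        return RedirectToAction("Submit");
    }
}
```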
Related
I am trying to connect to an MSTR Intelligence Server in Seattle from MSTR Developer running on my laptop in Bangalore. It takes an average of 10+ seconds for any action I do in Developer, such as logging in, opening folders, or opening a report. It is almost impractical to do any report development this way (not to mention the frustration).
When my colleague connects to the same instance/project from Seattle, he doesn't face any delays, so I figure this is a network issue and doesn't have much to do with the metadata or indexes. The network response time to the box averages 30 ms from Seattle and 300 ms from Bangalore. I found online that 280 ms is the average response time from India to the US. Accessing the reports and projects via the web interface is smooth, though.
Have you ever experienced a situation like this before? Can the network delays cause that much trouble on MicroStrategy? Please help…
PS: This question is not quite a fit for SO, but I guess that MSTR developers face this problem regularly and maybe they know a fix. Hence posting it here rather than SU or somewhere else.
This is a pretty common problem, in my experience. I believe that MicroStrategy's network traffic is XML based, so network bandwidth as well as latency is an issue.
Usually, the web server is more responsive because:
It is performing "simpler" tasks than Developer
The network-intensive traffic is between I-Server and web server, so if they're colocated, performance will be reasonable.
I'm afraid I've never come across an effective solution to this issue. Having a "jump server" in the same data centre as the MSTR servers, with the Developer software installed, is usually the most tolerable solution (provided Remote Desktop isn't too laggy).
Same solution here: we have developer VMs on a host in the same datacenter as the server, and we Remote Desktop into them. From there, we use Developer, Object Manager, etc.
You can still do 90% of the tasks in the web interface.
If I want to stress test a 'classic' client-server (desktop app <-> LAN <-> database server) Windows Forms desktop application to see how it performs when many concurrent PC users are using it, how should I go about it? I want to simulate many PC users concurrently going through a workflow, to see if it all stands up and at what point the system degrades unacceptably. I've looked at many test tools, but they all seem to be skewed towards testing functionality or web app performance, which is quite different.
Clearly having many actual people on actual PCs is not practical, and lots of virtual machines on a few PCs is not representative either. 'Cloud' computing (EC2, Azure, etc.) looks promising, but the documentation and pricing information all seem to be skewed towards mobile apps or web servers, again not the same (but that could just be presentation, so I remain open to the idea). I need to be able to virtualise a small LAN of many client machines running the application, plus a database server.
Can anyone suggest how to do this, or recommend something?
TIA
IMHO the real question is - do you really need to do performance testing in your case? Consider this - where is your business and functional logic?
Performance testing of desktop applications is an oxymoron in itself. A desktop application is made to be used by one person at a time, so if getting a response takes 5 seconds, it will take (pretty much) 5 seconds no matter how many users are clicking the button. The only real shared backend is the DB, and databases are designed to handle serious concurrent load. If that is not enough, just make a cluster.
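If you still want numbers, one cheap way to get them is to hammer just the database with the same queries your workflow issues, from many tasks at once, and watch how response time degrades. A rough sketch (connection string, query, and user counts are placeholders; swap System.Data.SqlClient for whatever provider your app actually uses):

```csharp
// Quick-and-dirty load sketch: run the workflow's queries from N concurrent workers
// and watch how total time degrades as N grows. Connection string, query, and the
// worker counts are placeholders; swap SqlClient for your app's actual provider.
using System;
using System.Data.SqlClient;
using System.Diagnostics;
using System.Linq;
using System.Threading.Tasks;

class DbLoadTest
{
    const string ConnStr = "Server=dbserver;Database=AppDb;Integrated Security=true;";

    static void Main()
    {
        foreach (var users in new[] { 10, 50, 100, 200 })
        {
            var sw = Stopwatch.StartNew();
            var tasks = Enumerable.Range(0, users)
                                  .Select(_ => Task.Run(() => RunWorkflowOnce()))
                                  .ToArray();
            Task.WaitAll(tasks);
            Console.WriteLine($"{users} concurrent users: {sw.ElapsedMilliseconds} ms total");
        }
    }

    static void RunWorkflowOnce()
    {
        using (var conn = new SqlConnection(ConnStr))
        using (var cmd = new SqlCommand("SELECT COUNT(*) FROM Orders WHERE Status = @s", conn))
        {
            cmd.Parameters.AddWithValue("@s", "Open");
            conn.Open();
            cmd.ExecuteScalar();   // replace with the real sequence of queries your workflow issues
        }
    }
}
```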
Background
I am a developer who works for a health care organization. We build a variety of business apps, the majority of which contain PHI (Patient Health Information). We work on laptops in-house and occasionally have the option to work from home. Something we are discussing, though, is how to handle the data stored on our laptops when we are working out of the office.
Although we have passwords and our laptops are encrypted, that still doesn't seem like enough to protect the data. What I mean is this: we are a small five-person team. When we are working on a task, we all work locally against our own databases on our laptops. When a change is done, we commit to SVN and publish to a test server. Our concern is that my local database is sometimes a copy of production so that I can test against real data. That local database could contain thousands of records of PHI. This is obviously a major concern when we take our laptops out of the building, because if my laptop were stolen, I would be putting thousands of patients' health information at risk. Not something we want to do.
My Question
What are best practices for developers when it comes to patient data safety? The same question applies to financial data. Either way, how do people work with patient/customer data locally?
Is it fair to say that sometimes you just don't have the ability to connect to a database behind a firewall, or is that just negligence? Even if I keep the database internal, I still have project code on my laptop. Is that bad too?
• Should I have fake data?
• Should all data be on an internal machine that you connect to?
• Should I only connect to a machine that is internal?
I can’t imagine that is what people do all the time.
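To make the "fake data" option concrete, a scrubbing pass over a non-production copy (run before the copy ever reaches a laptop) might look roughly like this; the server, database, table, and column names are invented:

```csharp
// Illustration of the "fake data" option only: scrub identifying columns in a
// non-production copy before it leaves the building. The server, database, table,
// and column names are all invented; point this at the copy, never at production.
using System;
using System.Data.SqlClient;

class ScrubPhiCopy
{
    static void Main()
    {
        var connStr = "Server=internal-sql;Database=AppDb_DevCopy;Integrated Security=true;";
        const string scrub = @"
            UPDATE Patients SET
                FirstName   = 'Test',
                LastName    = 'Patient' + CAST(PatientId AS varchar(20)),
                Ssn         = NULL,
                DateOfBirth = '1970-01-01',
                Phone       = '555-0100',
                Email       = 'patient' + CAST(PatientId AS varchar(20)) + '@example.test';";

        using (var conn = new SqlConnection(connStr))
        using (var cmd = new SqlCommand(scrub, conn))
        {
            conn.Open();
            Console.WriteLine($"Scrubbed {cmd.ExecuteNonQuery()} patient rows.");
        }
    }
}
```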
We are discussing this as a team and would love to hear your feedback on how you (or anyone else) work as a remote developer.
Thanks
I would like to create an app where the user can add and view data, adding from a desktop/tablet or phone and reading from either. I would like the data store to be synced across all of the user's devices.
I'm starting to play with the Azure trial, and it looks promising; it's probably a solid way to sync data through the cloud between a user's devices. Other than syncing between a user's devices, I have no need for cloud services currently.
I've seen some apps that do a 'Backup/Restore' model with the user's SkyDrive account. But this seems to be a manual process. I'd like to see it done seamlessly.
I've looked into Sync services, but that would be more like a hub/spoke solution. Again, I don't need a central database.
What are some options? At this point, I would be fine using just Windows 8 patterns/practices.
Because they are separate devices, you will need some service layer to do the store-and-forward for you. With that, you have two basic choices: use the end user's own storage (i.e. SkyDrive) or use your own storage (i.e. Windows Azure).
SkyDrive is fully supported through the Live SDKs and provides a secure way for a user to store their data and synchronize it across multiple devices. You get security, and there is no cost for the server-side storage on your part; the user owns their storage, not you. The limitation is that you may have issues sharing that same data across other devices or users where SkyDrive (or whatever file sync service you use) is not available.
With a service layer like Azure, you have a lot more flexibility, but you will also be responsible for maintaining (and paying for) that server-side storage and those services. Have you looked at Windows Azure Mobile Services? With your Azure account you get 10 free Mobile Services. You will pay for the SQL data storage on the backend, and that cost will depend on the amount of data you store on the server side. You also need to make sure to architect your application in a way that protects an individual user's data, but that is actually pretty easy to do, well documented, and gives you a lot of options.
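For a sense of what the client side looks like, here is a rough sketch using the Mobile Services managed client (NuGet package WindowsAzure.MobileServices); the service URL, application key, and the Note class are placeholders for whatever you define in the portal:

```csharp
// Rough sketch against the Windows Azure Mobile Services managed client
// (NuGet package: WindowsAzure.MobileServices). The service URL, application key,
// and the Note class are placeholders for whatever you define in the portal.
using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.MobileServices;

public class Note
{
    public int Id { get; set; }
    public string Text { get; set; }
}

public static class NoteStore
{
    static readonly MobileServiceClient Client = new MobileServiceClient(
        "https://yourservice.azure-mobile.net/",   // placeholder service URL
        "YOUR-APPLICATION-KEY");                   // placeholder application key

    // Insert goes into the service's backing SQL table; any device that talks to
    // the same service (and authenticates as the same user) sees the same rows.
    public static Task SaveAsync(string text)
    {
        return Client.GetTable<Note>().InsertAsync(new Note { Text = text });
    }

    // Query the table; the Where clause is just a placeholder to show filtering.
    public static Task<List<Note>> LoadAsync()
    {
        return Client.GetTable<Note>().Where(n => n.Id > 0).ToListAsync();
    }
}
```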
Lastly, consider what type of data you want to share. SkyDrive is great for "files": pics, songs, videos, etc. Windows Azure Mobile Services (WAMS) is great for "data".
Neither model is right or wrong. It just depends on your goals.
Hope that helps you go through the thought process
I am currently working on a file system application in C# that requires users to log in to a Perforce server.
During our analysis, we figured that having unique P4 login accounts per user is not really beneficial and would require us to purchase more licenses.
Considering that these users are contractual and will only use the system for a predefined amount of time, it's hard to justify purchasing licenses for each new contractual user.
With that said, are there any disadvantages to having a "group" of users share one common login account to a Perforce server? For example, we'd have X groups sharing X logins.
From a client-spec point of view, will Perforce be able to detect that even though someone synced to head, the newly logged-in user (who's on another machine) also needs to sync to head? Or are all files flagged as synced to head because someone else already synced?
Thanks
The client specs are per machine, and so will work in the scenario you give.
However, Perforce licenses are strictly per person, so you would be breaking the license agreement and using the software illegally. I really would not advocate that.
In addition to the 'real' people you need licenses for, you can ask for a couple of free 'robot' accounts to support things like automatic build services, admin etc.
Perforce have had arrangements in the past for licensing of temporary users such as interns, and so what I would recommend is you contact them and ask what they can do for you in your situation.
Greg has an excellent answer and you should follow his directions first. But I would like to make a point on the technical side of sharing clients across multiple machines: this is generally a bad idea. Perforce keeps track of the contents of each client by client name only. So if you sync a client on one machine and then try to sync the same client on another machine, the other machine will only get the "recently" changed files and none of the changes that were synced on the first machine.
The result is that you have to do a lot of force syncing, or keep track of the changelists you sync to and do some flushing and then syncing.
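For illustration, those two workarounds boil down to something like the following (sketched in C# shelling out to the p4 command line, since the app is in C#; the client name "shared_client" and changelist 12345 are placeholders, and p4 must be on the PATH):

```csharp
// Sketch of the two workarounds when one client spec is reused on a second machine.
// "shared_client" and changelist 12345 are placeholders; p4 must be on the PATH.
using System.Diagnostics;

class SharedClientCatchUp
{
    static void RunP4(string args)
    {
        var psi = new ProcessStartInfo("p4", args) { UseShellExecute = false };
        using (var p = Process.Start(psi)) { p.WaitForExit(); }
    }

    static void Main()
    {
        // Workaround 1: brute force -- re-transfer everything regardless of the have list.
        RunP4("-c shared_client sync -f //...");

        // Workaround 2: reset the server's have list to the changelist actually on this
        // machine's disk (no file transfer), then sync normally to pick up the rest.
        RunP4("-c shared_client flush //...@12345");
        RunP4("-c shared_client sync //...");
    }
}
```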