How to achieve database synchronization in Adobe AIR without LCDS

I designed an application that pulls data from the database once it starts. If all the clients pull data from the server at the same time, it consumes a lot of bandwidth; worse, a client may close and reopen the application many times a day, so there is a lot of bandwidth consumption on the server.
Is there any database sync technique I can implement in an AIR desktop application?
If anybody knows, please let me know. Please don't suggest LCDS (it is a bit costly).
Thanks in advance,
Cheers,
vasu

Try GraniteDS.
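Whatever framework you end up with, the underlying technique is the same: have each client remember the timestamp of its last successful sync and ask the server only for rows changed since then, instead of re-pulling the whole dataset on every start. A minimal server-side sketch of that delta-sync idea (Flask, SQLite, and the records table are illustrative assumptions, not anything from the question):

```python
# Minimal delta-sync endpoint: a client sends the timestamp of its last
# successful sync and receives only rows modified after it. Flask, SQLite,
# and the "records" table are illustrative assumptions.
import sqlite3
from flask import Flask, jsonify, request

app = Flask(__name__)
DB_PATH = "app.db"  # hypothetical database file

@app.route("/sync")
def sync():
    # First-time clients default to "send everything".
    since = request.args.get("since", "1970-01-01T00:00:00")
    conn = sqlite3.connect(DB_PATH)
    try:
        rows = conn.execute(
            "SELECT id, payload, updated_at FROM records WHERE updated_at > ?",
            (since,),
        ).fetchall()
    finally:
        conn.close()
    return jsonify(
        [{"id": r[0], "payload": r[1], "updated_at": r[2]} for r in rows]
    )
```

The AIR client would persist the newest updated_at it received (its local SQLite store is a natural place) and send it back as since on the next launch, so closing and reopening the app many times a day costs almost nothing.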

Related

MicroStrategy Developer too slow

I am trying to connect to an MSTR Intelligence Server in Seattle from MSTR Developer running on my laptop connected in Bangalore. It takes an average of 10+ seconds for any action I take in Developer, such as logging in, opening folders, or opening a report. It is almost impractical to do any report development this way (not to mention the frustration).
When my colleague connects to the same instance/project from Seattle he doesn’t face any delays. So I figure that this is a network issue and doesn’t have much to do with the metadata or indexes. The network response time to the box is 30ms and 300ms average from Seattle and Bangalore respectively. I found online that 280ms is average response time from India to US. Accessing the reports and projects via the web interface is smooth though.
Have you ever experienced a situation like this before? Can the network delays cause that much trouble on MicroStrategy? Please help…
PS: This question is not quite a fit for SO, but I guess that MSTR developers face this problem regularly and maybe they know a fix. Hence posting it here rather than on SU or somewhere else.
This is a pretty common problem, in my experience. I believe that MicroStrategy's network traffic is XML based, so network bandwidth as well as latency is an issue.
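To see why latency alone can hurt this much: if a single Developer action needs on the order of 30 sequential request/response round trips (a made-up but plausible figure for a chatty XML protocol), it costs roughly 30 × 30 ms ≈ 0.9 s from Seattle but 30 × 300 ms ≈ 9 s from Bangalore, which lines up with the 10+ seconds you are seeing.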
Usually, the web server is more responsive because:
It is performing "simpler" tasks than Developer
The network-intensive traffic is between I-Server and web server, so if they're colocated, performance will be reasonable.
I'm afraid I've never come across an effective solution to this issue. Having a "jump server" in the same data centre as the MSTR servers, with the Developer software installed, is usually the most tolerable solution (provided Remote Desktop isn't too laggy).
Same solution here: we have developer VMs on a host in the same datacenter as the server, and we Remote Desktop into them. From there we use Developer, Object Manager, etc.
You can still do 90% of the tasks in the web interface.

Database for live mobile tracking

I'm developing an app that tracks a mobile device instantly (live), and I need some advice. The application must send the location to a web service, which in turn records the received data in a database.
What would be, in your opinion, the best way to store the location values?
I'm new to big data, and I'm afraid that simple SQL requests won't handle the load properly. I imagine that if there are a lot of users and each user sends a request every second, I'll have issues with the database.
Any advice? Thank you very much.
I think you could have a look at the geospatial queries in MongoDB, if you choose to go ahead with it.
Refer here
And here
The design of the database will depend on the nature of the queries (essentially the reads and writes).
Worth having a look into.
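To make that concrete, here is a minimal pymongo sketch of a 2dsphere index plus a $near query; the database, collection, and field names are illustrative assumptions:

```python
# Minimal sketch of storing and querying live positions in MongoDB.
# Database/collection/field names are made up for illustration.
from pymongo import MongoClient, GEOSPHERE

client = MongoClient("mongodb://localhost:27017")
positions = client.tracking.positions

# A 2dsphere index lets MongoDB answer "near this point" queries efficiently.
positions.create_index([("location", GEOSPHERE)])

# Store each update as GeoJSON; note the [longitude, latitude] order.
positions.insert_one({
    "device_id": "device-42",
    "location": {"type": "Point", "coordinates": [-73.97, 40.77]},
    "recorded_at": "2016-01-01T12:00:00Z",
})

# Find recent positions within 500 metres of a point of interest.
nearby = positions.find({
    "location": {
        "$near": {
            "$geometry": {"type": "Point", "coordinates": [-73.98, 40.76]},
            "$maxDistance": 500,  # metres
        }
    }
})
for doc in nearby:
    print(doc["device_id"], doc["location"]["coordinates"])
```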
Working at Cintric, we landed on using Elasticsearch. We process billions of location points in real time and provide advanced analytics to our users.
We started with MongoDB and ran into a lot of trouble, eventually leading to a painful migration.
Our stack currently has mobile devices dump location updates into AWS Kinesis, which are then processed by AWS Lambda handlers and indexed into Elasticsearch. We're able to serve, process and store 300 million requests/month for only a few hundred dollars/month. Analytics for our dashboard add additional cost, but for your needs I would highly recommend checking out your options on AWS.
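For a sense of what the middle of such a pipeline can look like, here is a rough sketch of a Lambda handler that drains a Kinesis batch into Elasticsearch; the index name, record shape, and endpoint are my assumptions, not Cintric's actual code:

```python
# Sketch of an AWS Lambda handler: decode a batch of Kinesis location
# records and bulk-index them into Elasticsearch. Endpoint, index name,
# and payload fields are illustrative assumptions.
import base64
import json

from elasticsearch import Elasticsearch, helpers

es = Elasticsearch(["https://example-es-endpoint:9200"])  # hypothetical endpoint

def handler(event, context):
    actions = []
    for record in event["Records"]:
        # Kinesis delivers each payload base64-encoded.
        payload = json.loads(base64.b64decode(record["kinesis"]["data"]))
        actions.append({
            "_index": "locations",  # assumed index name
            "_source": {
                "device_id": payload["device_id"],
                # A geo_point mapping on "location" is assumed.
                "location": {"lat": payload["lat"], "lon": payload["lon"]},
                "timestamp": payload["ts"],
            },
        })
    # One bulk request per Kinesis batch instead of one per point.
    helpers.bulk(es, actions)
    return {"indexed": len(actions)}
```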

MS Access slow on a network share

I have a .NET application (VB.NET) that runs against an MS Access database. Every data request opens a connection to the Access database, runs the query, returns the results, and closes the connection again.
I placed the database on a windows xp 32-bit machine.
I have two clients on which I installed the .NET application. Both clients are running windows 7 professional 32-bit.
Now I have a performance problem with this.
When I use the first client it runs fine; all data is shown very fast. When I then use the second client, it takes some 10 seconds to connect to the database, fetch the data and close the database connection. When I ask for other data on that second client, it all runs fine, until I request data from the first client again. Then it takes another 10 seconds on the first client before my data is fetched.
Can anybody please help me with that? I owe a Belgian beer to the solver of this issue ;-)
Thanks!
Tom Wickerath wrote a great article on improving multiuser performance for MS Access applications. While his article assumes an MS Access front-end, many of the tips should apply to a .NET application. I recall two points that might help you:
Keep a persistent connection to the back-end (see the sketch below)
Use (short) UNC paths instead of mapped drives
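The persistent-connection tip is easy to sketch. The point is to hold one connection open for the whole life of the application so the Jet lock file isn't created and torn down on every request; shown here with Python/pyodbc purely for illustration, but the same applies to an OleDbConnection in VB.NET (the path and table are made up):

```python
# Sketch of the "persistent connection" tip: hold one connection open for
# the application's lifetime so the Access/Jet lock file is not created
# and torn down on every request. Python/pyodbc is used purely for
# illustration; the same idea applies to an OleDbConnection in VB.NET.
import pyodbc

# Short UNC path, per the second tip: avoid mapped drive letters.
CONN_STR = (
    r"Driver={Microsoft Access Driver (*.mdb, *.accdb)};"
    r"DBQ=\\server\share\app.mdb;"  # hypothetical path
)

# Opened once at startup; intentionally never closed until the app exits.
persistent_conn = pyodbc.connect(CONN_STR)

def fetch_orders(customer_id):
    # Per-request connections are still fine; the persistent one above
    # just has to stay open in the background to keep the lock file alive.
    conn = pyodbc.connect(CONN_STR)
    try:
        cur = conn.cursor()
        cur.execute("SELECT * FROM Orders WHERE CustomerID = ?", customer_id)
        return cur.fetchall()
    finally:
        conn.close()
```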
After a long search, I found it: my virus scanner, NOD32, was causing this, most probably by excessively scanning inbound and outbound network traffic.
I'm not sure stackoverflow is the right place for questions like this, but ...
It sounds like the first process is locking the file, so the second process has to wait.
"Use SQL Server" isn't a completely flippant response - SQL Server is specifically designed to handle concurrency issues like this.
IMHO ...
PS:
This is a pretty lame link, but it might help:
http://office.microsoft.com/en-us/access-help/about-sharing-an-access-database-on-a-network-mdb-HP005240860.aspx
PPS:
Here's a somewhat better link, with some suggestions for things you can do to improve concurrency:
http://www.softcoded.com/web_design/upgrading_access.php

Offsite backups - possible with large amounts of code/source images etc?

The biggest hurdle I have in developing an effective backup strategy is being able to do some sort of offsite backup. Unfortunately, this can only be done by uploading data to the offsite location, and my cable connection's upload speed makes this prohibitive.
Has anyone here managed to do offsite backups of large libraries of source code?
This is only relevant to home users and not in the workplace where budgets may open up doors.
EDIT: I am using Windows Vista (So 'nix solutions aren't relevant).
Thanks
I don't think your connection's upload speed will be as prohibitive as you think. Just make sure you look for a solution where your changes can be sent as diffs. Even if your initial sync takes days, daily changes would likely be more manageable.
Knowing a few more specifics about how much data you are talking about, and exactly how slow your connection is, would allow the community to make more specific suggestions.
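To make "send changes as diffs" concrete, here is a minimal sketch of change detection by content hash, so only files that actually changed since the last run get uploaded; the manifest name and paths are made up, and the upload itself is left as a stub:

```python
# Minimal sketch of diff-style backup: hash every file and upload only the
# ones whose content changed since the last run. The upload step is a stub;
# the paths and manifest file name are assumptions.
import hashlib
import json
import os

MANIFEST = "backup_manifest.json"  # hypothetical state file

def file_hash(path):
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def changed_files(root):
    try:
        with open(MANIFEST) as f:
            old = json.load(f)
    except FileNotFoundError:
        old = {}  # first run: everything counts as changed
    new = {}
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            new[path] = file_hash(path)
    with open(MANIFEST, "w") as f:
        json.dump(new, f)
    return [p for p, digest in new.items() if old.get(p) != digest]

for path in changed_files(r"C:\projects"):
    print("would upload:", path)  # replace with the actual upload call
```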
Services like Mozy allow you to back up large amounts of data offsite.
They upload slowly in the background, and getting the initial sync to the servers can take a while depending on your speed and amount of data, but after that they use efficient diffs to keep the stored data in sync.
The pricing is very home-friendly too.
I think you have to define "backup" and what's acceptable to you.
At my house, I have a hot backup of our repositories: I poll SVN once an hour over the VPN and it pulls down any check-ins. This is just to catch any check-ins that would not otherwise be captured until the normal 24-hour backup. I also send a full backup every 2 days through the pipe, to be outside of the normal 3-tier backup we do at the office. Our current repository is 2 GB zipped at max compression; that takes 34 hrs at 12 k/s and 17 hrs at 24 k/s. You did not say the speed of your connection, so it's hard to judge whether that's workable.
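For what it's worth, that hourly pull needs nothing fancier than a scheduled loop around the svn client; a minimal sketch, assuming a command-line svn client and a local mirror checkout (the path is made up):

```python
# Hourly "hot backup" pull: update a local working copy from the
# repository to catch recent check-ins. A cron job or Windows Task
# Scheduler entry would do the same job as this loop.
import subprocess
import time

BACKUP_COPY = r"C:\backup\repo-mirror"  # hypothetical local checkout

while True:
    # Pull down any check-ins made since the last poll.
    subprocess.run(["svn", "update", BACKUP_COPY], check=True)
    time.sleep(3600)  # once an hour
```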
If this isn't viable, you might want to invest in a couple of 2.5" USB drives and load/swap them offsite to a safety deposit box at the bank. This used to be my responsibility, but I lacked the discipline to do it consistently each week to ensure some safety net. In the end it was just easier to live with uploading the data to an FTP site at my house.

Web-based or thick client through VPN?

At an electric company where I was hired temporarily, we have to implement an upgrade of the billing and payments system (the current system runs on dBase III). The company's programmer and I have decided to use VB.NET and MySQL.
The company serves several towns and has billing and payments centers in selected towns. Every billing period, the meter readers read every electric meter and write the readings on a sheet. At 5 pm each day, an employee from each center collects the sheets and travels to the main center, where the readings are encoded.
The billings are printed in the main center, and then distributed to the branches.
During discussions with the General Manager and heads of the company, the two of us were tasked to take advantage of the internet, because the towns where the centers are located have internet connectivity, and for those that don't, we can use mobile internet.
The new system will allow users to enter the readings, and then send the data to the main server in the main branch. They also have the ability to download and print the billings.
Our problem now is what type of system to implement: should it be web-based, or a desktop application that connects to our database server through a VPN?
If this is a fixed price project, and the client will accept either web or desktop, go with desktop over VPN. You'll save A TON of time, and have something that is more responsive (from a user perspective).
However, if you think the client will eventually need to use the product on mobile devices or the web, you're shooting yourself in the foot by going winforms.
Having had some experience with using a thick client through VPN, I'd say go with some kind of web app.
If done wrong, a thick client can become really painful to use through a VPN because of data churning. A web app concentrates all of that on the server, which makes it much better from that point of view.
Other benefits:
no deployment hassle
no direct access to the database from the user machine.
Of course, it also depends on your skills, and on how much time/budget you have...
I do not know the client's situation... but what about giving them the best of both worlds? Since it sounds like you will be programming on a Windows-based system and have deployment access to Windows-server-based hardware, why not build a Silverlight application, or a WPF application hosted in an IE window?
I think the answer depends on the type and frequency of database queries you need to make. Querying a DB from a thick client through VPN can be SLOOOOOWWWWWW. In a web app, the application logic runs close to the DB, maybe even on the same machine, so DB queries are fast. The downside is that the UI can be slower. But it is probably easier to design a responsive web-based UI than to make the VPN fast.
What device will your bill collector use?
1. A laptop with a mobile internet connection, or
2. A specialized handheld tool that reads the meter and sends the data to the service center?
If it is a laptop, then you can create a website where only authorized people can log in and insert readings into the database (see the sketch below). Use HTTPS for better security.
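A minimal sketch of that kind of page, with Python/Flask and SQLite standing in for the real VB.NET/MySQL stack purely for illustration (the credentials, table, and field names are made up; the app must be served over HTTPS):

```python
# Minimal sketch of an authenticated reading-entry endpoint. Flask and
# SQLite stand in for the real VB.NET/MySQL stack purely for illustration;
# the credentials, table, and field names are made up. Serve it over HTTPS
# so the Basic-auth credentials are encrypted in transit.
import sqlite3
from flask import Flask, abort, request

app = Flask(__name__)
USERS = {"collector1": "s3cret"}  # hypothetical credential store

@app.route("/readings", methods=["POST"])
def submit_reading():
    auth = request.authorization
    # Only authorized personnel may post readings.
    if not auth or USERS.get(auth.username) != auth.password:
        abort(401)
    data = request.get_json()
    conn = sqlite3.connect("billing.db")
    try:
        conn.execute(
            "INSERT INTO readings (meter_id, reading, read_at) VALUES (?, ?, ?)",
            (data["meter_id"], data["reading"], data["read_at"]),
        )
        conn.commit()
    finally:
        conn.close()
    return {"status": "ok"}, 201
```

The parameterized INSERT also keeps the collectors' input from turning into a SQL injection problem.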