My database must be updated manually at arbitrary intervals with new information, stored in standard tables that don't require structural modifications. One app will update the database.
Another app, the one I will distribute (to friends and family only, so it doesn't require any security features such as encrypting the database), will read the database and use its data in labels, listviews, etc.
The problem is, I'm the perfect definition of a full-fledged n00b at programming of any sort, and I still don't know what database to use.
I thought I should use a SQL Server CE (*.sdf) file and store that database on an FTP server. Then I could download it and read data from it every time the "client" app runs and a certain button ("Connect") is clicked.
After some hard-core googling, I found out how to connect to the sdf thing, using this connection string:
Provider=Microsoft.SQLSERVER.CE.OLEDB.3.5;Data Source=D:\Documents and Settings\Camilo\JCTM.sdf
So it connects, or at least it doesn't show any error.
I don't know if it's a good idea to use SQL CE .sdf files as databases; if it's too hard, maybe I should go for XML? What do you suggest? What is the easiest way to implement very simple databases in VB.NET?
By simple databases I mean:
- no search needed
- no advanced features except storing strings in tables with columns and rows
- easy to access, read, edit, etc. by different VB.NET apps
Is sdf a good idea?
I would recommend SQL Server Express. It's free and can be redistributed with .NET applications as part of the install process.
The challenge will be syncing the changes between the different clients. If you have access to an FTP server, you may also be able to host a website in IIS. If you can do that, you can use web services and read against one central database instead of copying one locally.
Luckily for you, you can abstract away the need to be concerned with which back-end database you use to store your data.
Technologies such as ODBC (Open Database Connectivity) and OLEDB (Object Linking and Embedding, Database) allow you to limit your concern for the backend datastore to the task of crafting your connection string. In your example, the connection string says, "I'm going to connect to a SQL Server CE database via its OLEDB provider, and it's physically located over on D:\..."
Within the code, you use standard OLEDB mechanisms to access and manage the database. Because of this abstraction, you can use OLEDB providers for SQL Server, Oracle, XML, Access or comma delimited text files as your backing store if you wish, and the only change you need to make to your code is the connection string. Your choice then should be to pick the database that you have the tools and know-how to set up and manage initially.
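For example, a rough VB.NET sketch of that pattern (the table and column names are made up for illustration; only the connection string ties it to SQL Server CE, so pointing it at an Access .mdb via the Jet provider would leave the rest of the code unchanged):

    Imports System.Data.OleDb

    Module OleDbDemo
        Sub Main()
            ' Your SQL Server CE string from above; an Access backend would instead use
            ' something like "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\data\JCTM.mdb".
            Dim connStr As String =
                "Provider=Microsoft.SQLSERVER.CE.OLEDB.3.5;Data Source=D:\Documents and Settings\Camilo\JCTM.sdf"

            Using conn As New OleDbConnection(connStr)
                conn.Open()
                ' Hypothetical table "Contacts" with a text column "Name".
                Using cmd As New OleDbCommand("SELECT Name FROM Contacts", conn)
                    Using reader As OleDbDataReader = cmd.ExecuteReader()
                        While reader.Read()
                            Console.WriteLine(reader.GetString(0))
                        End While
                    End Using
                End Using
            End Using
        End Sub
    End Module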
I'd start with Microsoft Access because it has its own UI, and can play well with .NET.
You can also try the ADO.Net implementation for SQLite, which I've also found very useful.
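If you try SQLite, a minimal sketch with the System.Data.SQLite ADO.NET provider might look like this (the file and table names are invented; the provider creates the file on first open):

    Imports System.Data.SQLite   ' System.Data.SQLite package

    Module SQLiteDemo
        Sub Main()
            Using conn As New SQLiteConnection("Data Source=simple.db")
                conn.Open()
                Using cmd As New SQLiteCommand(
                    "CREATE TABLE IF NOT EXISTS Notes (Id INTEGER PRIMARY KEY, Body TEXT)", conn)
                    cmd.ExecuteNonQuery()
                End Using
                Using cmd As New SQLiteCommand("INSERT INTO Notes (Body) VALUES (@b)", conn)
                    cmd.Parameters.AddWithValue("@b", "hello")
                    cmd.ExecuteNonQuery()
                End Using
            End Using
        End Sub
    End Module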
I am currently creating a VB.NET program in which users upload a song file to the program, which is then saved within the program's files. I have set up the actual saving of the files, but I would also like to store some metadata for each one in a SQL database within my program.
I have looked online, and although I now understand the basics of SQL, I'm still a little fuzzy on how you actually implement it within VB.NET. I have already added the import (Imports System.Data.SqlClient) but haven't worked out how to begin coding against SQL.
The basic thing I'm trying to achieve is an If statement that determines whether or not a SQL database has been created in a specific location, and if it hasn't, creates it.
All constructive answers appreciated, thanks.
There are a number of different database engines available. The namespace that you have chosen contains the ADO.NET client classes for Microsoft SQL Server. You would use a connection string to specify how to connect to the database. This would often contain connection information, such as server name, user name, password etc, but it sounds like you want to store data locally.
There is a local version of SQL Server called LocalDB, but I think you would still need quite a lot of the SQL Server components installed for that to work. Although you can package these with your application they may be too large for you, so you may want to look at SQL Server Compact Edition, which is much smaller and allows you to package the whole engine as part of your application and is useful for storing data locally. Compact edition doesn't have quite all of the features that LocalDB does, so you may want to compare the features available for each.
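If you go the Compact Edition route, the "create it if it doesn't already exist" check from your question could look roughly like this (a sketch only; the file name, table, and columns are placeholders, and it assumes a reference to System.Data.SqlServerCe):

    Imports System.IO
    Imports System.Data.SqlServerCe

    Module CreateDbDemo
        Sub Main()
            Dim dbFile As String = "songs.sdf"            ' example location
            Dim connStr As String = "Data Source=" & dbFile

            If Not File.Exists(dbFile) Then
                ' Create the database file, then a table for the song metadata.
                Using engine As New SqlCeEngine(connStr)
                    engine.CreateDatabase()
                End Using
                Using conn As New SqlCeConnection(connStr)
                    conn.Open()
                    Using cmd As New SqlCeCommand(
                        "CREATE TABLE Songs (Id INT IDENTITY PRIMARY KEY, Title NVARCHAR(200), FilePath NVARCHAR(260))", conn)
                        cmd.ExecuteNonQuery()
                    End Using
                End Using
            End If
        End Sub
    End Module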
Although you can use the ADO.NET objects to connect to a database, I think most people these days would use a layer on top which transfers data back and forwards between objects in memory and the database. This also allows you to use Linq to query the database in most cases. I personally use Entity Framework. You might want to look into that. There are different ways of configuring EF so you may want to look at a tutorial. Once you have it set up, you will probably find it much easier and safer to work with than writing SQL manually though.
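As a very rough sketch of the Entity Framework (code-first) style, using invented class names for the song-metadata case (EF creates the database on first use if it doesn't exist, based on its configured connection):

    Imports System.Data.Entity   ' EntityFramework package

    Public Class Song
        Public Property Id As Integer
        Public Property Title As String
        Public Property FilePath As String
    End Class

    Public Class LibraryContext
        Inherits DbContext
        Public Property Songs As DbSet(Of Song)
    End Class

    Module EfDemo
        Sub Main()
            Using db As New LibraryContext()
                db.Songs.Add(New Song With {.Title = "Example", .FilePath = "example.mp3"})
                db.SaveChanges()
            End Using
        End Sub
    End Module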
I have a VB.NET Windows Forms application with a database on SQL Server 2008 in the .\SQLEXPRESS instance.
I have created a setup for my project using the walkthrough below:
http://msdn.microsoft.com/en-US/library/49b92ztk(v=vs.80).aspx
When a user installs my application, the database will be available to them, and the user can simply export the SQL Server database.
How can I secure my database so that the user doesn't have an easily available copy of it?
I thought of creating a new password-protected instance, other than .\SQLEXPRESS (where I created the database in the walkthrough above), during installation of my application on the user's PC, so that a complete copy of the database used by my application would not simply be available for the user to export.
So could anyone please guide me...
The question is; how far do you want to go to protect your data?
Better protection of your data usually comes at the cost of more development time and likely less user friendliness, for example due to lower performance (encryption is not free). More complex code usually results in more support requests too.
Where the best balance is depends on your business model (if any) and on your user requirements.
Keep in mind that anything you deploy to an end-user's machine is in the end vulnerable. If something is valuable enough, there will be people trying to steal it.
So, you could argue that the best protection is not to deploy the data at all. You could back your end-user application with a web service and keep the data on your own server, for example in the cloud.
I've found however that you sometimes just need to trust your users. If you build a good product that makes them happy, they have no reason to steal from you. In fact, they are probably glad to pay you.
If you decide that you need to deploy the data and that you need to encrypt it, you should think about why you chose SQL Server.
What database features do you need exactly? Do you need a full-blown database server for that?
Any local admin can gain control over any SQL Server database in seconds, so the built-in SQL Server authentication will not bring you many benefits.
You could switch to SQL Server CE and keep the database within your application. That would make the database a lot harder to access for a regular user.
If all you're doing is looking up words, you may be better off with a different storage engine like Lucene.
Lucene is actually a search engine, so it's highly optimized for matching words or parts of words.
You can run Lucene inside your .NET application, so you don't even need the end-user to install SQL Server. There is a .NET port of Lucene, Lucene.NET.
Lucene however doesn't protect your data. There's tooling available that will allow anybody to view and extract the data from the stored index files.
Since Lucene is open source though, you could extend it to support encrypted data storage (see this related question).
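To give a feel for it, here is a small sketch of indexing and looking up a word with Lucene.NET (3.x API; the directory path and field name are arbitrary):

    Imports System.IO
    Imports Lucene.Net.Analysis.Standard
    Imports Lucene.Net.Documents
    Imports Lucene.Net.Index
    Imports Lucene.Net.QueryParsers
    Imports Lucene.Net.Search
    Imports Lucene.Net.Store

    Module LuceneDemo
        Sub Main()
            Dim dir = FSDirectory.Open(New DirectoryInfo("wordindex"))
            Dim analyzer As New StandardAnalyzer(Lucene.Net.Util.Version.LUCENE_30)

            ' Index one document with a single field.
            Using writer As New IndexWriter(dir, analyzer, True, IndexWriter.MaxFieldLength.UNLIMITED)
                Dim doc As New Document()
                doc.Add(New Field("word", "dictionary", Field.Store.YES, Field.Index.ANALYZED))
                writer.AddDocument(doc)
            End Using

            ' Look the word up again.
            Using searcher As New IndexSearcher(dir)
                Dim parser As New QueryParser(Lucene.Net.Util.Version.LUCENE_30, "word", analyzer)
                Dim hits = searcher.Search(parser.Parse("dictionary"), 10)
                Console.WriteLine("Matches: {0}", hits.TotalHits)
            End Using
        End Sub
    End Module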
What concepts/steps do I need to learn to access a central database (server)? The system will have software (operated by users) that accesses the database on the server, for systems like inventory, billing, etc.
First of all, you need to know VB.NET or C# (which I assume you do). Next, you must be able to write SQL queries, stored procedures, T-SQL, etc. to communicate with the database. Finally, you need to know ADO.NET to communicate between your application and the database.
You can start your ADO.NET study here:
http://msdn.microsoft.com/en-us/library/e80y5yhx.aspx
http://www.codeguru.com/vb/gen/vb_database/adonet/article.php/c15033
Here is a link which will help you understand the data access layer: http://msdn.microsoft.com/en-us/library/aa581778.aspx
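To give a feel for the ADO.NET piece, a minimal sketch (the server, database, table, and column names are placeholders):

    Imports System.Data.SqlClient

    Module AdoNetDemo
        Sub Main()
            ' Placeholder connection string; point it at your central server.
            Dim connStr As String = "Server=myServer;Database=Inventory;Integrated Security=True"

            Using conn As New SqlConnection(connStr)
                conn.Open()
                Using cmd As New SqlCommand("SELECT ItemName, Quantity FROM Items WHERE Quantity < @min", conn)
                    cmd.Parameters.AddWithValue("@min", 10)
                    Using reader As SqlDataReader = cmd.ExecuteReader()
                        While reader.Read()
                            Console.WriteLine("{0}: {1}", reader.GetString(0), reader.GetInt32(1))
                        End While
                    End Using
                End Using
            End Using
        End Sub
    End Module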
Hope this helps...!!!
We are using several text files as Templates to create the results of a WCF Data Services - Service Operation call.
The text files are each at most 3,000 bytes.
What are the pros and cons of storing my template files on the file system with the WCF Data Services files vs storing them in a SQL Server 2008 R2 server?
Prior to SQL Server 2008, I recommended strongly against storing large objects like text files in the database. It tended to slow down access and made them generally harder to work with. Instead, I generally recommended storing links to the files in question.
Of course, this meant that the database would not protect the files in the event someone deleted something they shouldn't and the files needed to be backed up and transferred separately from the database.
With SQL Server 2008, I think many of the former problems have been overcome using the FILESTREAM feature, and I think that storing files with FILESTREAM can be quite useful at times. It continues to store the actual data outside the database, which avoids many of the former complications. But it still binds the two together and permits the database to protect the files rather than just relying on the links in the database being correct.
There are a lot of pros and cons for either storage method. Nowadays (my opinion has changed, and may change again some day), I'd focus on security and manageability.
If it is sensitive data, you might get a bit more security by storing them in the database. If nothing else, it might be more difficult to hack a database than a file system. If security is not so important, it can be easier/simpler to store it on the OS.
For managing, if the data gets updated (and how frequently does that happen), how easy is it to update? One instance in a database is simpler to update (or corrupt...) than an instance on each of however many servers are in your web farm. (1 server, no problem, 20 servers, possible headache.)
I think it's better to store the data directly in the database. This makes it even faster to access, because a database is more efficient at reading and generally handling data. You can even store movies in databases without any problem, and it's also possible to stream large data.
As for security, there are more than enough options for configuring your database. And if you cluster your database, it becomes even more scalable.
I've written a small (8-10 laptops) point-of-sale system running over a wireless network, as an HTA that reads from/writes to an Access MDB located on a network share.
I need to use ADO - GetString and the user roster are not available with DAO.
I also need to use DAO - the MDB cannot be compacted with ADO.
I know that:
1) If the database backend is not an Access MDB, I should use ADO.
2) If the backend is an MDB, but I want to upgrade to SQL Server at some point, I should use ADO.
3) Within an Access application, or any other VBA/VB application, I should use DAO, as ADO must go through a translation layer of the Jet OLE DB Provider, while DAO is more direct.
4) VBScript/JScript allows me to use either DAO or ADO.
The two-part question is as follows:
1) In this software environment (HTA/scripting), is it better to use ADO rather than DAO?
2) Does ADO offer any benefits because the HTA is reading/writing over a wireless network?
If the only reason you need DAO is to compact the database, you can use DAO for that, and use ADO for everything else. You are not limited to using only ADO or DAO.
The biggest benefit of using ADO is that it will be easier to move to SQL Server Express when the time comes. You should do that sooner rather than later, as SQL Server Express offers all of the benefits of MSAccess databases without the drawbacks. SQL Server Express is free, and it will easily handle the system size you are proposing.
Access databases corrupt easily in a multi-user environment, especially when a wireless network is involved. If you are worried about losing the benefits of working in MSAccess, you can still attach to SQL Server using linked tables, and work with your SQL Server Express database that way.
You can also use JRO to compact your MDB file. This will be included with any recent version of MDAC, installed by default on XP and later systems. No installation of Access is necessary.
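A short sketch of the JRO route (the paths are examples; JRO writes the compacted copy to a new file, so you swap it for the original afterwards; shown here in VB.NET with late binding, and the same objects are available from VBScript via CreateObject):

    ' Requires Option Strict Off for the late-bound call.
    Module CompactDemo
        Sub Main()
            Dim engine As Object = CreateObject("JRO.JetEngine")
            engine.CompactDatabase(
                "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=\\server\share\pos.mdb",
                "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=\\server\share\pos_compacted.mdb")
        End Sub
    End Module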
To answer your specific questions:
1) I would opt for ADO just because it is more current, and the same API can be used in other kinds of scripting, like LDAP/Active Directory access, reading file system folders, reading MAPI mail, and working with other types of semi-structured text files like fixed-width text and CSV. It's not specifically better for the HTA programming environment, but it's perhaps better for you to learn a more widely applicable API. I also think it's an easier API to work with, but I started with it and only later worked on some older DAO projects.
2) One possible benefit that ADO provides is that of disconnected recordsets, which may have an advantage or at least suggest some architectural alternatives in your wireless network setup. You open a recordset, then disconnect it, so you can still work with the data in memory, but not have to leave a database connection open. Then at a later time you can reconnect and update the database. Also, you can work in a fully disconnected style by managing tables as local XML or ADTG files.
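For example, a rough sketch of that disconnected pattern (late-bound ADO with Option Strict Off in VB.NET; the same sequence of calls works from VBScript in the HTA, and the connection string, table, and field names are only examples):

    Module DisconnectedDemo
        Sub Main()
            Const adUseClient = 3, adOpenStatic = 3, adLockBatchOptimistic = 4
            Dim connStr = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=\\server\share\pos.mdb"

            Dim rs As Object = CreateObject("ADODB.Recordset")
            rs.CursorLocation = adUseClient
            rs.Open("SELECT * FROM Sales", connStr, adOpenStatic, adLockBatchOptimistic)
            rs.ActiveConnection = Nothing                        ' disconnect; the data stays in memory

            If Not rs.EOF Then rs.Fields("Total").Value = 42     ' work with the data offline

            Dim conn As Object = CreateObject("ADODB.Connection")
            conn.Open(connStr)
            rs.ActiveConnection = conn                           ' reconnect later
            rs.UpdateBatch()                                     ' push the offline changes back
            conn.Close()
        End Sub
    End Module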
You might shoehorn DAO into working from VBScript but that's an odd pairing. ADO makes the most sense in general.
Another advantage of ADO would be that it supports RDS over DCOM or HTTP. This can be used to overcome many of the limitations of Jet MDBs used via file sharing, such as the corruption unreliable networks and clients can lead to. It also cuts the amount of traffic over the network, improving performance. In addition it offers a middle tier in which business objects can "live" and all of this can be mediated using COM+ where applicable.
Of course you no longer have the option of using a simple P2P network and a file share to host the database. RDS needs a server to host the process and run the Jet engine, which no longer needs to run on each client system. This means you can use Jet stored procedures that run on the server, offloading more client processing and network traffic. While not as sophisticated as T-SQL or other alternatives, this ADO/Jet 4.0 OLE DB Provider exclusive technology offers tangible benefits that can't be had using DAO.
RDS can mask much of the process of using disconnected Recordsets, simplifying client code. It uses ADTG under the hood, which was developed and optimized for this very purpose.
However using RDS requires more infrastructure and expertise than a simple file share. You might as well look into a low-end version of SQL Server.
In general I'd recommend using the jetcomp.exe utility to compact and repair, over either DAO or JRO. It offers a number of advantages.
If you can afford to use an HTA application with Access instead of a more performant platform, I would say you should go with whichever API is easiest for you. The bottleneck will never be the abstraction of database access in your case. Still, DAO is really old.