I have started experimenting with FileTable in SQL Server. I got it working once, and was able to add and delete files via the share name, and delete files by deleting records in the proper FileTable.
However, I had names that I didn't like (test, table1, table2, etc.), and I had it on the (SQL Server default) system disk instead of the data disk, so I deleted everything and started over, intending to set everything up properly, since (I thought) I now understood enough about it to use it. But my new setup doesn't work, and I can't figure out why.
Specifically, I am unable to access the share created to hold the FileStream, neither locally when logged into the server, nor remotely from other machines. It shows in the list of shared folders, but attempting to open it gives me only an error message that I do not have permission to open it.
Both machines are members of the same domain, and my account has local admin privileges on both. There is one instance on the server, and three databases in the instance - production, development and attachments. I regularly turn the upgraded development version into the production version, so I created a separate attachments database, which is where I put the FileTable tables - accessible from both production and development, but not duplicated, since the number of attached files and their contents are both large.
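For context, the attachments database and its FileTable were created along these lines (names and paths here are placeholders, not my exact script):
-- Hypothetical names/paths throughout:
CREATE DATABASE Attachments
ON PRIMARY (NAME = Attachments_data, FILENAME = 'D:\Data\Attachments.mdf'),
FILEGROUP AttachmentsFS CONTAINS FILESTREAM
    (NAME = Attachments_fs, FILENAME = 'D:\Data\AttachmentsFS')
LOG ON (NAME = Attachments_log, FILENAME = 'D:\Data\Attachments.ldf')
WITH FILESTREAM (NON_TRANSACTED_ACCESS = FULL, DIRECTORY_NAME = N'Attachments');
GO
USE Attachments;
-- The FileTable whose directory should appear under the share:
CREATE TABLE dbo.Documents AS FileTable
WITH (FileTable_Directory = 'Documents');
GO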
I have been reading all sorts of things on the net, and permissions are regularly mentioned, but I seem to have permissions. I can query the FileTable in that third database (with no results, because I have been unable to open the share to put files in it, but also no error, so I do have at least read access). And it is not the individual FileTable directories inside the share that are inaccessible, but the entire share itself.
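For example, both of these run without error against the attachments database (the table name is a placeholder):
-- UNC root of this database's FileTable directory:
SELECT FileTableRootPath();
-- Hypothetical FileTable; returns zero rows, but no permission error:
SELECT stream_id, name, is_directory FROM dbo.Documents;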
Can anyone give me an idea of what I may be overlooking? I had it working once, so it's not some fundamental problem with the configuration of the server.
Hello, I have a database created with only Microsoft Access (meaning no SQL Server has been used). Is it possible for multiple users to use it from different computers, with the data they input updated on all the computers?
Can someone briefly tell me how, if the answer is yes?
Much appreciated
I have on several occasions used the following technique with success:
(1) Split the Access Database in two:
The Back End: This database should contain the shared tables.
The Front End: A database for forms, queries and basically everything except tables.
Instead of actual tables, this database should contain "linked" versions of those tables which are held in the "back end".
(2) There is a central copy of the front-end database, but no-one opens this directly. Instead, they run a batch file which creates a local copy of that central front-end, and then opens that.
This setup has the advantage that the central "front-end" remains unused, and therefore isn't locked, and so the developer can edit it. The users will get the updates whenever they next launch the database using the batch file.
A second advantage is that the "backend" can be upsized to a "proper" database, and the front-end could then remain largely unchanged; the linked tables would simply no longer point to another Access database.
I have a business case where I am developing a simple search UI, I would like to link it to our SQL Server as the performance is pretty fast when I test it. My plan is to create a few linked tables and create a tidy search form for each linked table (different datasets).
UPDATE: here is a better description of my plan.
I have a single user ID / password that I want to use in each ODBC connection on 4 linked SQL tables (it's considered an APP ID at my company, and the password never changes). There will be 4 forms that link to each of the tables, and each user will have their own accde db with a launch file that places a copy on the user's profile drive and opens it from there. This allows each user to have their own copy of the accde file and everyone to have only one "launch" file.
This search UI will have upwards of 2000 users, and who knows how many will actually be executing a search at any given time. Security is not a concern, as it is a DB on an internal SQL Server which is managed by our IT area. The end users are all internal employees.
Will using just the one ID potentially lock out my APP ID and cause major issues?
Will MS Access no longer be a major choke point if each user has their own accde file?
Thank you and sorry that my first version of this question was not 100% clear, thanks!
So, I figured I would circle back and post up what I did. While providing a singular app ID with a File DSN in a shared location for the Access Front end would have worked, it was ultimately not the most stable solution.
Since I am in a large corporate environment, my options were extremely limited. That said, I was able to have a read-only role added to the database I manage, sourced an Active Directory group that had the membership I needed (as a bonus, the membership was managed at the corporate level!), and added the AD group to the read-only role.
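In T-SQL terms, the role setup amounted to something like this (all names here are placeholders, not our real ones; ALTER ROLE ... ADD MEMBER needs SQL Server 2012 or later, older versions use sp_addrolemember):
-- Map the corporate AD group to a server login and database user:
CREATE LOGIN [CORP\SearchUsers] FROM WINDOWS;
CREATE USER [CORP\SearchUsers] FOR LOGIN [CORP\SearchUsers];
-- Create the read-only role, add the group, and grant read access:
CREATE ROLE SearchReadOnly;
ALTER ROLE SearchReadOnly ADD MEMBER [CORP\SearchUsers];
GRANT SELECT ON SCHEMA::dbo TO SearchReadOnly;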
I then created a File DSN using Windows authentication, placed it in a shared folder location (where I also gave the same AD group read-only access to the folder), and emailed out a shortcut to a simple batch file launcher that copies an ACCDE database to the user's profile drive.
The accde houses all the necessary search forms, logic and linked tables needed for the end user. I even built in a back door that crashes out the end users (with a warning) via a simple file rename. The front end runs surprisingly fast with the test group of 100 individuals and is rolling out to 500 next week.
Voila. Hope this helps someone trying to do something similar.
How do I handle updating my SQLite database file when I update my application? Basically, when I update the app, the database needs to be updated too. I thought about adding a table to the database that would hold the database version, but my only concern with that is how I would read from two databases with the same name: one would be in the Documents directory (the old one) and the new one (with the same name) would be in the main bundle. The application itself does not require any user preferences to be stored in the database, so I don't have to worry about overwriting users' data. But the application does make updates to the database from the internet, and these will ultimately be incorporated into the next release. Does anyone have any tips on how to handle this?
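For what it's worth, the version marker I was considering might not even need its own table; SQLite has a built-in user_version field in the file header (a minimal sketch):
-- Read the schema version stamped in the SQLite file (0 by default):
PRAGMA user_version;
-- Stamp the file after applying an upgrade:
PRAGMA user_version = 2;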
You could use 'NSFileManager' to copy the database from the application bundle into the Documents directory before you open it.
We are not hosting our databases. Right now, one person manually creates a .bak file from the production server. The .bak is then copied to each developer's PC. Is there a better approach that would make this process easier? I am working on a build project for our team right now, and I am thinking about adding the .bak file to SVN so each person has the correct local version. I tried generating a SQL script, but it contains only the schema, no data.
Developers can't share a single dev database?
Adding the .bak file to SVN sounds bad. That's going to keep every version of it forever - you'd be better off (in most cases) leaving it on a network share visible by all developers and letting them copy it down.
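If you go the network-share route, the backup itself can be written straight to the share; a rough sketch, with hypothetical server and database names:
-- Back up directly to a share all developers can read:
BACKUP DATABASE [ProductionDb]
TO DISK = N'\\fileserver\devshare\ProductionDb.bak'
WITH INIT, STATS = 10;
GO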
You might want to use SSIS packages to let developers make ad hoc copies of production.
You might also be interested in the Data Publishing Wizard, an open source project that lets you script databases with their data. But I'd lean towards SSIS if developers need their own copy of the database.
If the production server has online connectivity to your site you can try the method called "log shipping".
This entails creating a baseline copy of your production database, then taking chunks of the transaction log written on the production server and applying the actions contained in those log chunks to your copy. This ensures that, after a certain delay, your backup database will be in the same state as the production database.
Detailed information can be found here: http://msdn.microsoft.com/en-us/library/ms187103.aspx
As you mentioned SQL 2008 among the tags: as far as I remember, SQL 2008 has some kind of built-in automation to set this up.
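In manual form, the cycle boils down to something like this (database name and paths are placeholders):
-- On the production server: back up a chunk of the transaction log.
BACKUP LOG [ProductionDb]
TO DISK = N'\\share\logship\ProductionDb_001.trn';
-- On the copy (restored earlier WITH NORECOVERY): apply the chunk.
RESTORE LOG [ProductionDb]
FROM DISK = N'\\share\logship\ProductionDb_001.trn'
WITH NORECOVERY;
GO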
You can create a scheduled backup and restore.
You don't have to copy the backup to each developer's PC; SQL Server has its own backup folder you can use.
You can also have a restore script generated for each PC from one location, if the developers want to hold the database on their local systems.
-- Restore from the shared backup file (UNC path is a placeholder):
RESTORE DATABASE [xxxdb]
FROM DISK = N'\\xxxx\xxx\xxx\xxxx.bak'
WITH FILE = 1, NOUNLOAD, REPLACE, STATS = 10
GO
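The backup side can be a simple scheduled job; a file name given without a path lands in that default backup folder:
-- Hypothetical database name; the relative file name goes to the
-- instance's default backup folder mentioned above.
BACKUP DATABASE [xxxdb]
TO DISK = N'xxxdb.bak'
WITH INIT, STATS = 10
GO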
Check out SQL Source Control from RedGate; it can be used to keep schema and data in sync with a source control repository (the docs say it supports SVN). It supports the database on a centrally deployed server, or on many developer machines as well.
Scripting out the data probably won't be a fun time for everyone depending on how much data there is, but you can also select which tables you're going to do (like lookups) and populate any larger business entity tables using SSIS (or data generator for testing).
Scenario
In our replication scheme we replicate a number of tables, including a photos table that contains binary image data. All other tables replicate as expected, but the photos table does not. I suspect this is because of the larger amount of data in the photos table or perhaps because the image data is a varbinary field. However, using smaller varbinary fields did not help.
Config Info
Here is some config information:
Each image could be anywhere from 65-120 KB
A revision and an approved copy are stored along with thumbnails, so a single row may approach ~800 KB
I once had trouble with the "max text repl size" configuration option, but I have set that to the maximum value using sp_configure and RECONFIGURE WITH OVERRIDE (see the snippet after this list)
Photos are filtered based on a “published” field, but so are other working tables
The databases are using the same local db server (in the development environment) and are configured for transactional replication
The replicated database uses a “push” subscription
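Specifically, this is how I set that option (the value is the SQL Server 2005 maximum):
-- Raise the replication limit for large text/binary columns:
EXEC sp_configure 'max text repl size', 2147483647;
RECONFIGURE WITH OVERRIDE;
GO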
Also, I noticed that sometimes regenerating the snapshot and reinitializing the subscription caused the images to replicate. Taking this into consideration, I configured the snapshot agent to regenerate the snapshot every minute or so for debugging purposes (obviously this is overkill for a production environment). However, this did not help things.
The Question
What is causing the photos table not to replicate while all the others replicate without a problem? Is there a way around this? If not, how would I go about debugging further?
Notes
I have used SQL Server Profiler to look for errors as well as the Replication Monitor. No errors exist. The operation just fails silently as far as I can tell.
I am using SQL Server 2005 with Service Pack 3 on Windows Server 2003 Service Pack 2.
[update]
I have found out the hard way that Philippe Grondier is absolutely right in his answer below. Images, videos and other binary files should not be stored in the database. IIS handles these files much more efficiently than I can.
I do not have a straight answer to your problem, as our standard policy has always been 'never store (picture) files in (database) fields'. Our solution, which applies not only to pictures but to any kind of file or document, is now standard:
We have a "document" table in our database, where document/file names and relative folders are stored (in order to get unique document/file names, we generate them from the primary key/uniqueIdentifier value of the 'Document' table).
This 'document' table is replicated among our different subscribers, like all other tables.
We have a "document" folder and
subfolders, available on each of our
database servers.
Document folders are then replicated independently from the database, with some files and folders replication software (allwaysynch is an option)
The main publisher's folders are fully accessible through FTP; a user trying to read a document that is (still) unavailable on his local server will be prompted to download it from the main server through an FTP client (such as coreFTP and its command-line options).
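A minimal sketch of such a 'document' table (all names are hypothetical):
-- The on-disk file name is derived from the uniqueidentifier key,
-- e.g. '6F9619FF-8B86-D011-B42D-00C04FC964FF.pdf':
CREATE TABLE dbo.Document (
    DocumentId   uniqueidentifier NOT NULL
                 CONSTRAINT PK_Document PRIMARY KEY DEFAULT NEWID(),
    OriginalName nvarchar(260) NOT NULL,  -- name as supplied by the user
    RelativePath nvarchar(400) NOT NULL   -- subfolder under the shared document root
);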
With an images table like that, have you considered moving that article to a one-way (or two-way, if you like) merge publication? That may alleviate some of your issues.
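If you try that, the broad strokes on SQL Server 2005 would look something like this (publication and table names are hypothetical, and merge publishing must already be enabled on the database):
-- Create a merge publication containing just the photos article:
EXEC sp_addmergepublication
    @publication = N'PhotosMerge',
    @sync_mode = N'native';
EXEC sp_addmergearticle
    @publication = N'PhotosMerge',
    @article = N'Photos',
    @source_owner = N'dbo',
    @source_object = N'Photos';
GO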