We've been on SharePoint 2007 for close to 2 years now. We find it's a great CMS that helps us centralize project documents and collaborate with less duplication and confusion. The custom list feature sometimes offers a quick-and-dirty alternative to custom forms and purpose-built SQL solutions.
That said, we still have over 100 terabytes of file shares outside of SharePoint, with documents dating back to the beginning of time.
As we look forward to smarter, faster and bigger network file systems...
(1) What realistic role should SharePoint play?
(2) How reliable can SharePoint be as a seamless place to store documents? I mean, will we be able to save/retrieve documents to SharePoint from all popular clients without having to open SharePoint?
Part of why I ask: a few months ago, as part of another project, we attempted to use SharePoint 2007 like a file share, attempting to set up a Windows drive mapping and UNC path and/or WebDAV. Our finding was that not every client (XP, Vista, 7, Mac OS X, etc.) plays nicely with UNC, WebDAV and drive mappings. Not surprised. Does this change significantly looking ahead to future SharePoint releases?
(3) Are there documents that have no business in SharePoint? Databases? Executables? Proprietary logs? What about documents where we expect lots of row-level I/O from potentially multiple users?
(4) How many customers would you say are seriously looking at SharePoint as a significant alternative to file shares? I understand SharePoint databases should not exceed 100 GB, so we have a database for every site collection. But we have over 100 TB of potential content. If there are customers seriously looking to go this way, what might their architecture look like? BLOB storage outside of SQL? EBS vs. RBS? Who are the major players that offer this, and will SharePoint ever offer it natively? EMC? StoragePoint? Who else?
(5) What about performance and content indexing concerns?
Thanks in Advance.
If you're seriously considering BLOB storage and SharePoint, then you should look into Remote BLOB Storage (RBS). See Overview of Remote BLOB Storage. Besides the free FILESTREAM-based provider, there are third-party RBS providers that can place the BLOBs on SANs, like the EMC one.
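For a sense of what RBS involves on the SharePoint side: it's a SQL Server 2008 R2 / SharePoint 2010 feature rather than a 2007 one, so it only matters looking ahead. Once a FILESTREAM (or third-party) provider has been installed against a content database, turning it on is a few lines against the server object model. This is only a rough sketch: the site URL is a placeholder, and it assumes the first registered provider is the one you want.

```csharp
using System;
using Microsoft.SharePoint;
using Microsoft.SharePoint.Administration;

class EnableRbs
{
    static void Main()
    {
        // Placeholder URL: any site collection in the content database to externalize.
        using (SPSite site = new SPSite("http://sharepoint/sites/projects"))
        {
            SPContentDatabase contentDb = site.ContentDatabase;
            var rbs = contentDb.RemoteBlobStorageSettings;

            if (!rbs.Installed())
                throw new InvalidOperationException(
                    "Install the RBS provider (RBS.msi) against this content database first.");

            rbs.Enable();

            // Assumption: the first registered provider (e.g. the FILESTREAM one) is the target.
            foreach (string providerName in rbs.GetProviderNames())
            {
                rbs.SetActiveProviderName(providerName);
                break;
            }

            Console.WriteLine("RBS enabled for " + contentDb.Name);
        }
    }
}
```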
Metalogix's StoragePoint has an offering called FileShare Librarian that may be the answer you are looking for: it will quickly recreate the file structure and permissions in SharePoint while leaving the BLOBs externalized. There is also FileShare Migration Manager for a full-fidelity migration, and you can still externalize the BLOBs to EMC with StoragePoint.
Is there a way to do this easily: keep the version history of the document with the document itself, as a .gdoc or whatever format? Or am I resigned to having to download, separately, all the revisions made to said document in the past?
For context: I have a document I've been editing and revising over the years for my own medical history, list of meds, etc., and I have been using Google Docs to do this because it was convenient and I didn't have to pay for Microsoft Office or install a separate word processor on my PC. Recently I've purchased Dropbox Personal for my cloud storage needs.
I want to do the following: take the Google Doc and save it as a .gdoc (which isn't an option in the File menu??) and take it over to Dropbox's Vault as an editable copy with its revision history intact.
Otherwise, what I have done (before I even realized revision history was a thing) is just copy-paste its current version onto a new .gdoc in Dropbox Vault.
So, is that possible? And if so, how and as easily (lazily) as I possibly can? Also, is this even the right place to ask for this? Apologies if it isn't. I didn't see much else about this specific issue anywhere... (also lazy)
Thanks!
EDIT:
I am by no means a coder in any sense. I'm a full-time elderly caretaker, just a guy with a specific, niche(?), technical problem, and I thought this was the first place to ask without having to go through tech support with Google chat, etc. It might also help other people who like seeing how their documents have changed over the years, history fans, etc. At the end of the day it's a programming/coding issue that could be resolved some way, somehow... Right?
If I can add pictures here for context, LMK.
Thanks :)
The .gdoc file format is only accessible via Google Docs, which is web-based. Downloading the file to your local storage means you would have to open it on your device using a local word processor such as Microsoft Office, LibreOffice, etc., which is why the .gdoc format is not available when you download. This is also why you won't be able to open it from your Dropbox.
The version/revision history on Google Docs is tied only to that specific file with its unique ID, and it is stored on the web. So when you download the file, the version history won't be available in the downloaded copy; and even when you make a copy of the document, the version history does not get copied, so that won't be an option either.
It looks like you'll have to stick to manually copying or making a backup of the current version of the file before editing, since version history is only kept for a period of 30 days or the last 100 versions, unless a version is manually set to "Keep forever."
Google drive version history: https://googledrivepro.com/google-drive-version-history/
The machine I'm trying to access is an industrial PC that serves as the interface for some PLC-automated equipment. The computer creates a data record (an .mdb database) of the various machine settings so that they can be reviewed later if desired. I've created an application in Visual Basic 2010 to display and sort that information once it's been copied from the machine to someone's laptop.
What I'd like to do, though, is allow the user to access the database from their laptop over the network; each PC has a static IP address on the customer's LAN. Currently we can use TeamViewer to transfer files, but I'd like to include that ability in my data-viewing application. Without changing any settings on the industrial PC (i.e., leaving network file sharing alone and avoiding installing any sort of SQL server software), how can I access this information? The database files can get pretty large (40+ MB), so I'm trying to avoid any sort of FTP transfer, which would undoubtedly take forever.
The database is already set up on an ODBC connection (which is how the PC stores the PLC information in the database), and I suspect that this may be the key, but between my lack of any thorough understanding of networking and the internet's general distrust of ODBC (well earned, from what I've seen so far), I'm having a hard time finding any useful information.
If anyone could point me towards some useful tutorials, or give me a good place to start, I'd appreciate it.
I think you are out of luck. Think about it: if you could access the file without sharing it, wouldn't that be a serious security problem?
What you can do: use the always-existing administrative share \\host\c$ to access the file. For that you'll need administrative privileges on the host.
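If you do go that route, reading the .mdb from your viewer application is just an ordinary OLE DB connection whose Data Source happens to be a UNC path. A minimal sketch (shown in C#; the same classes exist in VB.NET), with a hypothetical IP, path and table name, assuming the 32-bit Jet provider for an older .mdb:

```csharp
using System;
using System.Data;
using System.Data.OleDb;

class RemoteMdbReader
{
    static void Main()
    {
        // Hypothetical UNC path via the administrative share; the connection
        // still pulls the data across the network under admin credentials.
        const string connectionString =
            @"Provider=Microsoft.Jet.OLEDB.4.0;" +
            @"Data Source=\\192.168.1.50\c$\MachineData\settings.mdb;";

        using (var connection = new OleDbConnection(connectionString))
        using (var command = new OleDbCommand(
            "SELECT TOP 50 * FROM MachineSettings", connection))
        {
            connection.Open();

            var table = new DataTable();
            using (var adapter = new OleDbDataAdapter(command))
            {
                adapter.Fill(table);   // load the rows into memory for a grid or report
            }

            Console.WriteLine("Rows read: " + table.Rows.Count);
        }
    }
}
```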
But be aware: Even if this "works", it introduces problems:
It will be slow, because you are essentially copying 40 MB over the wire over and over again.
If you are not careful, you might lock the *.mdb file while accessing it. Access databases are not really meant to be accessed by multiple users/processes. That didn't prevent Microsoft from trying to fake it: to do so, a *.ldb file (L as in lock) is created to signal to other processes that the file is locked.
TL;DR: Don't do it.
I'm a little bit confused about the various types of storage that are available to Windows Store apps.
Let's say I had a notepad app, where users can view, create, and edit notes. What storage type would I use for storing the notes? Local storage? Write the notes out to files in the user's Documents folder? Also, what if I wanted to sync a user's notes via the cloud? I understand that Roaming Data has a rather low size limit.
Almost all the options you mention are possible for a notepad application, except the roaming data option, which only allows you to store 100 KB of data.
I will try to sum up the options that you have and add a few more:
Local storage
You can easily add these files to local storage; you can store them as plain files, or serialize your objects and store those. Very easy to implement. The con is that only your app can access these files.
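To make that concrete, here's a minimal sketch of saving and loading a note via ApplicationData local storage (the class and file names are just examples):

```csharp
using System.Threading.Tasks;
using Windows.Storage;

public static class LocalNoteStore
{
    // Save a note to the app's local folder (only this app can read it back).
    public static async Task SaveAsync(string fileName, string text)
    {
        StorageFile file = await ApplicationData.Current.LocalFolder.CreateFileAsync(
            fileName, CreationCollisionOption.ReplaceExisting);
        await FileIO.WriteTextAsync(file, text);
    }

    // Read the note back later.
    public static async Task<string> LoadAsync(string fileName)
    {
        StorageFile file = await ApplicationData.Current.LocalFolder.GetFileAsync(fileName);
        return await FileIO.ReadTextAsync(file);
    }
}
```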
Documents folder
Also an option, made easy by the use of the different file pickers, for example FileOpenPicker or FileSavePicker. Files can be stored in the format you like and can be accessed by other apps or through the file explorer.
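For example, saving a note through the FileSavePicker might look roughly like this (the "Plain text" label, .txt extension and suggested name are just placeholders):

```csharp
using System.Collections.Generic;
using System.Threading.Tasks;
using Windows.Storage;
using Windows.Storage.Pickers;

public static class DocumentNoteStore
{
    public static async Task SaveWithPickerAsync(string text)
    {
        var picker = new FileSavePicker
        {
            SuggestedStartLocation = PickerLocationId.DocumentsLibrary,
            SuggestedFileName = "New note"
        };
        picker.FileTypeChoices.Add("Plain text", new List<string> { ".txt" });

        StorageFile file = await picker.PickSaveFileAsync();
        if (file != null)                      // null means the user cancelled the dialog
        {
            await FileIO.WriteTextAsync(file, text);
        }
    }
}
```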
Roaming data
Not an option for files, due to the limited space.
SkyDrive API
If you want to store files in the cloud and access them anywhere, you could consider the SkyDrive API. Also note that if you use the file pickers, you also have the option to save/load these files to SkyDrive (although in that case the user chooses where to store the file).
Windows Azure Mobile Services
Another option if you want to store data in the cloud. It gives you the ability to store your data in one or more tables. Very easy to implement. More info about Mobile Services can be found here
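Roughly what that looks like in code, assuming a hypothetical Mobile Service named "mynotes", its application key from the portal, and a Note table created there (all placeholders):

```csharp
using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.MobileServices;

public class Note
{
    public int Id { get; set; }
    public string Title { get; set; }
    public string Body { get; set; }
}

public static class NoteCloudStore
{
    // Placeholder service URL and application key from the Azure portal.
    private static readonly MobileServiceClient Client =
        new MobileServiceClient("https://mynotes.azure-mobile.net/", "YOUR-APPLICATION-KEY");

    public static Task SaveAsync(Note note)
    {
        return Client.GetTable<Note>().InsertAsync(note);   // Id is filled in by the service
    }

    public static Task<List<Note>> LoadAllAsync()
    {
        return Client.GetTable<Note>().ToListAsync();
    }
}
```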
SQLite
If you need a local database to store your data, then SQLite can be an option. Tim Heuer has written a nice blog post about how to use SQLite in your Windows 8 app. You can find it here
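A commonly used wrapper is the sqlite-net library; a rough sketch with it below, where the Note class and database file name are invented for the example:

```csharp
using System.Collections.Generic;
using System.IO;
using System.Threading.Tasks;
using SQLite;                              // sqlite-net NuGet package
using Windows.Storage;

public class Note
{
    [PrimaryKey, AutoIncrement]
    public int Id { get; set; }
    public string Title { get; set; }
    public string Body { get; set; }
}

public static class NoteDatabase
{
    private static readonly string DbPath =
        Path.Combine(ApplicationData.Current.LocalFolder.Path, "notes.db");

    public static async Task SaveAsync(Note note)
    {
        var db = new SQLiteAsyncConnection(DbPath);
        await db.CreateTableAsync<Note>();         // no-op if the table already exists
        await db.InsertAsync(note);
    }

    public static async Task<List<Note>> LoadAllAsync()
    {
        var db = new SQLiteAsyncConnection(DbPath);
        await db.CreateTableAsync<Note>();
        return await db.Table<Note>().ToListAsync();
    }
}
```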
Hopefully this clears things up a bit and gives you some ideas about what to choose for your app.
In an app like this (a notepad-style app), the logical place to store your files is in the user's Documents folder. That way they are accessible to the user from other apps as well as the current one. There is, of course, the option to roll your own methods to upload the data to SkyDrive as well, but you really shouldn't rely on this as your only data source - what if the user is offline?
I want to use a file sharing server to keep certain files up to date and consistent across multiple instances of my application on multiple computers - like (for example) writing a multiplayer game that stores all the players' positions in a text file and uses something like Dropbox to keep the text file consistent across all the applications: each application instance can change the file with that application's player's position, and then the rest of the applications can update accordingly. This is only an example, and is not what I intend to do with this technology. What I want to do does not rely on sharing data very quickly, but only on periodically downloading and updating the text file.
I was wondering how I might be able to do this using the Dropbox API for Objective-C without prompting the user for any Dropbox username/password - just store a single Dropbox account's login information, log into it automatically and update/download the file stored on it?
From what I have found out by experimenting, Dropbox prompts users for their passwords via a web browser, and is designed to accommodate multiple accounts, whereas I only need to accommodate the "server" account.
So, is there any way to do this sort of thing using the Dropbox API, or should I use something else? Or do I need to work out how to write my own server? Using some sort of file sharing API seems a lot easier to me than writing an actual server.
Thanks for any help,
Ben
You might think about using Google App Engine (GAE). I had a similar requirement recently and I'm thinking this is a good option when you want centralized data. Plus you can do the no-browser account login by using your own custom authentication, or I think it's even possible via OAuth? Depends on how sensitive the data is I guess. I just rolled my own.
From my research I found that using Dropbox as a server has some scalability issues, since you'll be limited to maybe 5,000 calls per day (source). It's built on Amazon S3, so you could also look at using that directly.
GAE lifts that limit to 675,000 calls per day, and it can be increased up to 91 million for free.
https://developers.google.com/appengine/docs/quotas
I did find an open-source project for doing this with Java; alternatively, you could look at a Python example.
I've written a daemon that continuously checks for updated files and syncs them. I wrote it for my own file manager iOS app. You can find the implementation here:
https://github.com/H2CO3/MyFile/tree/master/DropboxDaemon
I'm personally not an iOS developer but I came across this question while looking for something else and thought I would offer up another potential solution to the OP's question.
Microsoft just released something called Azure Mobile Services which supports iOS development (among other platforms). It's basically a convenient way to set up a back end system complete with push notifications, authentication, etc. without rolling your own. You don't need to know anything about Azure or servers as the setup process walks you through most of it. It is new so keep that in mind, but it looks promising for situations like this.
Here's a 10 minute video explaining how to use it with an iOS developed app along with links to more documentation:
http://channel9.msdn.com/posts/iOS-Support-in-Windows-Azure-Mobile-Services/
Hope this helps.
I recently came across a problem with image file storage on a network.
I have developed a desktop application. It runs on a network and has a central database system. Users log in from their own computers on the network and do their work.
Until now the database operations have been going fine, no problem. Users share data from the same database server.
Now I am being asked to save the user's (operator's) photo too. I am unsure whether to save it in the database like the other data or to store it on a separate file server.
I would like to know which is better: storing images in the database or on a file server?
EDIT:
The main purpose is to store the account holder's photo and signature, and later show them during a transaction so that the teller can verify whether the person and signature are correct.
See these:
Storing images in database: Yea or nay?
Should I store my images in the database or folders?
Would you store binary data in database or folders?
Store pictures as files or in the database for a web app?
Storing a small number of images: blob or fs?
User Images: Database or filesystem storage?
Since this is a desktop application, it's a bit different.
It's really about how much data we are talking about here. If you've only got 100 or so users, and it's only profile pictures, I would store them in the DB for a few practical reasons (a rough sketch follows below):
No need to manage or worry about a separate file store
You don't need to give shared folder access to each user
No permissions issues
No chance of people messing up your image store
It will be included in your standard DB backup
It will be nicely linked to your data (no absolute vs. relative path issues)
Of course, if you're going to be storing tons of images for thousands of users, I would go with the file system storage.
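For the DB route mentioned above, a minimal sketch using plain ADO.NET and a varbinary(max) column, assuming SQL Server; the AccountHolders table and column names are invented for the example, and other providers work the same way:

```csharp
using System.Data;
using System.Data.SqlClient;
using System.IO;

public static class AccountPhotoStore
{
    // Hypothetical table: AccountHolders(Id int, Photo varbinary(max), Signature varbinary(max))
    public static void SavePhoto(string connectionString, int accountId, string photoPath)
    {
        byte[] photoBytes = File.ReadAllBytes(photoPath);

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(
            "UPDATE AccountHolders SET Photo = @photo WHERE Id = @id", connection))
        {
            command.Parameters.Add("@photo", SqlDbType.VarBinary, -1).Value = photoBytes;
            command.Parameters.Add("@id", SqlDbType.Int).Value = accountId;

            connection.Open();
            command.ExecuteNonQuery();
        }
    }

    public static byte[] LoadPhoto(string connectionString, int accountId)
    {
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(
            "SELECT Photo FROM AccountHolders WHERE Id = @id", connection))
        {
            command.Parameters.Add("@id", SqlDbType.Int).Value = accountId;

            connection.Open();
            return command.ExecuteScalar() as byte[];   // null if no photo is stored
        }
    }
}
```

The teller-side screen can then load the returned byte[] into a picture box for the visual check against the person and signature.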
I think you have to define what you mean by "better".
If you mean faster, my guess is that you don't want to use a database; you probably just want the images stored plainly on a file server.
If you want something like a mini Facebook, where you need a much more dynamic environment, perhaps you are better off storing them in a database.
This is more a question than an answer: what do you want to do with the pictures?