Dealing with sensitive password data in a Perforce repository

What is the best practice for dealing with files containing sensitive password data in Perforce?
In my team's projects, we are trying to eliminate such files from the repository.
Are there any Perforce-specific features/conventions that could help?
There are also some XML files that contain code (not just passwords) and that need to stay under revision control. How are such files handled?
The following posts, specific to Git, seem to be a good starting point:
Remove sensitive data
Best practices for github

You can use p4 obliterate to remove the offending revisions of the files that contain passwords. Going forward, you could check in .example files containing fake placeholder passwords and use your favorite scripting language to substitute the real passwords locally and generate the actual files. You could also add p4 protect permissions to prevent particular files from being checked in.
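For illustration, here is a minimal Python sketch of the .example approach; the file names, the @PASSWORD@ placeholder token, and the APP_DB_PASSWORD environment variable are all hypothetical, not something Perforce prescribes:

```python
import os

# Hypothetical names: config.xml.example is the checked-in template,
# config.xml is the generated local file that never enters the depot.
EXAMPLE_FILE = "config.xml.example"
REAL_FILE = "config.xml"
PLACEHOLDER = "@PASSWORD@"

def generate_real_config():
    # Pull the real secret from the environment (or a local secret store)
    # so it never appears in version control.
    password = os.environ["APP_DB_PASSWORD"]
    with open(EXAMPLE_FILE, "r", encoding="utf-8") as src:
        content = src.read()
    with open(REAL_FILE, "w", encoding="utf-8") as dst:
        dst.write(content.replace(PLACEHOLDER, password))

if __name__ == "__main__":
    generate_real_config()
```

Each developer runs the script once after syncing; the generated config.xml can then be excluded from the client view or protected so it is never submitted.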

Related

Dropbox API - Are the File IDs unique?

We are using the Dropbox REST API to migrate files from a Dropbox repo to our platform. The problem is that different files have the same IDs. We use the IDs and paths to create the folder/file hierarchy and replicate it on our platform. Because of this issue, some files appear in different folders rather than in the expected folder. Is this a bug, or should we rely entirely on the file paths?
File IDs in the Dropbox API are indeed expected to be unique. That is, files at two different paths shouldn't simultaneously have the same ID.
If you are seeing that, that would be a bug, which you can report here:
https://www.dropbox.com/developers/contact
However, note that file IDs are case sensitive, and it's possible for two file IDs to differ only by case, so make sure you're using case-sensitive comparisons.
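As a small illustration, grouping by ID should use the returned string exactly as-is; the entries below are made up, but they show how lowercasing IDs would wrongly merge two distinct files:

```python
from collections import defaultdict

# Hypothetical metadata entries, e.g. from a files/list_folder-style call.
entries = [
    {"id": "id:a4ayc_80_OEAAAAAAAAAXw", "path_display": "/reports/q1.pdf"},
    {"id": "id:A4AYC_80_OEAAAAAAAAAXW", "path_display": "/archive/q1.pdf"},
]

by_id = defaultdict(list)
for entry in entries:
    # Use the ID exactly as returned; entry["id"].lower() would collapse
    # the two distinct files above into a single bucket.
    by_id[entry["id"]].append(entry["path_display"])

for file_id, paths in by_id.items():
    print(file_id, paths)
```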

Trac: hide a project in the available projects page depending on permissions

I have multiple projects in Trac. I'm using mod_wsgi, and the TRAC_ENV_PARENT_DIR variable in my WSGI script points to the folder containing the folders for all of these projects. Different users have access to different projects. When a user visits the Trac URL, she sees a listing of all the projects, even though she has no access to some of them.
Is there any way to show a user only those projects he or she has access to?
Please advise.
Preamble: I abhor security through obscurity. Your request could be read as a cosmetic change to the web site's presentation. Don't aim at improved access control, because knowing a valid path will still give access to each Trac environment depending on its settings. Of course, better navigation is a good reason.
Hiding folders depending on a user's permissions means you need authentication before granting access to TRAC_ENV_PARENT_DIR. This can be done with standard mechanisms that your web server supports, and it is just the precondition.
As you say, you have some non-public Trac instances in your Trac environment folder collection. How complicated it is to identify all folders correctly depends on how much you want to spend on initial implementation vs. maintenance.
It should be trivial, but error-prone, to provide a list of either the public or the private directories, whichever is easier to maintain. Zero additional configuration would require opening each Trac environment and looking up user permissions.*) This sounds rather cumbersome and probably means a performance penalty for installations with a large user base and frequent access. You will at least want to work with a cached list if you go down this road.
You can't use Trac's auto-generated Available projects list; you'll have to deliver at least two versions of an index page, one for unauthenticated/unprivileged users and one for authenticated, privileged users.
For the sake of maintainability you'll want to consolidate configuration and permissions. For access to each Trac environment you could use trac.ini inheritance and a shared .htpasswd file. However, you can't inherit permissions, because these settings are stored inside the Trac db. You could give TracUserSyncPlugin a shot, but it doesn't seem fit for production yet, or at least it lacks feedback from all the happy users, if they exist.
*) While I'm not aware of dedicated documentation about this, there are actually several possibilities. Since permissions are stored in the Trac db, all of them involve reading/querying the permission db table. Its structure is documented with all the other tables of the Trac db schema. To read it you'll want to open the Trac environment(s) and then either run a direct query on the table (see an AccountManagerPlugin changeset for an example) or construct and query a PermissionCache object.
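For example, a small helper could walk TRAC_ENV_PARENT_DIR, open each environment, and keep only the projects the user may view. This is only a rough sketch against Trac's Python API; the parent path and username are placeholders, and the exact calls may vary between Trac versions:

```python
import os
from trac.env import open_environment
from trac.perm import PermissionCache

ENV_PARENT_DIR = "/srv/trac"   # placeholder for your TRAC_ENV_PARENT_DIR
USERNAME = "jane"              # the authenticated remote user

def visible_projects(parent_dir, user):
    projects = []
    for name in sorted(os.listdir(parent_dir)):
        env_path = os.path.join(parent_dir, name)
        if not os.path.isdir(env_path):
            continue
        try:
            env = open_environment(env_path, use_cache=True)
        except Exception:
            continue  # skip folders that are not Trac environments
        # PermissionCache answers "does this user have action X in this env?"
        perm = PermissionCache(env, user)
        if "TRAC_VIEW" in perm or "WIKI_VIEW" in perm:
            projects.append(name)
    return projects

print(visible_projects(ENV_PARENT_DIR, USERNAME))
```

The result could feed a custom index template, ideally cached as suggested above.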
This may be an old question, but so far I've found the answers to be more complex than necessary.
I think that, using the information stated here, http://trac.edgewall.org/wiki/TracInterfaceCustomization#ProjectList , one could build a template that checks users and permissions and then shows only the data it should.
In my case, I just needed to point the TRAC_ENV_INDEX_TEMPLATE variable to a blank HTML file, and that was enough for me.

File permissions on a web server?

I'm new to writing code for websites. The website allows users to upload files, such as profile pictures or other pictures. The files are saved in the Unix file system, and the URLs to find those images are stored in a MySQL database.
It seems like the only way I can let users upload files is to give write access to anybody using chmod; otherwise it complains that it doesn't have write permissions. But users shouldn't be able to write whatever they want or overwrite other users' stuff. Similarly, to allow users to see images they have rightful access to, they need read permissions on the file system. But that means anybody with the URL to a picture can see the image too, correct? That's not what I want.
Is there a solution to this contradiction? Or am I thinking about the problem incorrectly? Thanks for any help.
You need to manage the permissions in your application and not expose arbitrary parts of your local filesystem directly to clients. Your application should decide which files someone can see and where to write data. You should not trust data (filenames, etc.) from your clients. Ideally, store files on disk using systematically generated names and keep the human-readable names in the database.
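A hedged sketch of that idea in Python, with generated names on disk and the original names plus ownership in a database; the table, column, and directory names here are made up for illustration, and the web-framework wiring is left out:

```python
import os
import sqlite3
import uuid

UPLOAD_DIR = "/var/app/uploads"   # outside the public web root
os.makedirs(UPLOAD_DIR, exist_ok=True)

db = sqlite3.connect("files.db")
db.execute("""CREATE TABLE IF NOT EXISTS uploads (
    id TEXT PRIMARY KEY,          -- generated name used on disk
    owner TEXT NOT NULL,          -- user who uploaded the file
    original_name TEXT NOT NULL   -- human-readable name for display
)""")

def save_upload(owner, original_name, data):
    """Store the bytes under a generated name; never trust the client's name."""
    file_id = uuid.uuid4().hex
    with open(os.path.join(UPLOAD_DIR, file_id), "wb") as f:
        f.write(data)
    db.execute("INSERT INTO uploads VALUES (?, ?, ?)",
               (file_id, owner, original_name))
    db.commit()
    return file_id

def fetch_upload(requesting_user, file_id):
    """Return (name, bytes) only if the requesting user owns the record."""
    row = db.execute("SELECT owner, original_name FROM uploads WHERE id = ?",
                     (file_id,)).fetchone()
    if row is None or row[0] != requesting_user:
        return None  # the application decides access, not the filesystem
    with open(os.path.join(UPLOAD_DIR, file_id), "rb") as f:
        return row[1], f.read()
```

Because the web server never serves UPLOAD_DIR directly, knowing a URL is not enough; every read goes through the ownership check.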
SunStar9,
Since you are already using a MySQL database to store the URL of the image on the file system, why not just store the image itself as a BLOB (binary large object)?
This is generally a well-accepted design practice for allowing users to upload binary data to a website.
Are you using PHP, Java, Ruby/Rails, or something else to develop your website? Depending on what you are using, there may be file upload/management plugins or modules that will help you build what you are trying to do, if you are certain you want to use the file system for storing the image data.
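If you do go the BLOB route, the insert itself is simple. Here is a sketch assuming MySQL with the PyMySQL driver and a hypothetical profile_images table; adjust the names and connection details to your setup:

```python
import pymysql

# Hypothetical table:
#   CREATE TABLE profile_images (user_id INT PRIMARY KEY, image MEDIUMBLOB);
conn = pymysql.connect(host="localhost", user="app",
                       password="secret", database="site")

def store_profile_image(user_id, path):
    # Read the uploaded file and write its bytes into the BLOB column.
    with open(path, "rb") as f:
        data = f.read()
    with conn.cursor() as cur:
        cur.execute(
            "REPLACE INTO profile_images (user_id, image) VALUES (%s, %s)",
            (user_id, data))
    conn.commit()

def load_profile_image(user_id):
    with conn.cursor() as cur:
        cur.execute("SELECT image FROM profile_images WHERE user_id = %s",
                    (user_id,))
        row = cur.fetchone()
    return row[0] if row else None
```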

Storing uploaded content on a website

For the past 5 years, my typical solution for storing uploaded files (images, videos, documents, etc.) has been to throw everything into an "upload" folder and give each file a unique name.
I'm looking to refine my methods for storing uploaded content and I'm just wondering what other methods are used / preferred.
I've considered storing each item in its own folder (the folder name being the Id in the db) so I can preserve the uploaded file name. I've also considered uploading all media to a locked folder and then using a file handler: you pass it the Id of the file you want to download in the querystring, and it reads the file and sends the bytes to the user. This is handy for checking access and restricting bandwidth for users.
I think the file handler method is a good way to handle files, as long as you know how to make good use of resources on your platform of choice. It is possible to do stupid things like read a 1 GB file into memory if you don't know what you are doing.
In terms of storing the files on disk, it is a question of how many there are, what the access patterns are, and what OS/platform you are using. For some people it can even be advantageous to store files in a database.
Creating a separate directory per upload seems like overkill unless you are doing some type of versioning. My personal preference is to rename uploaded files and store the original name. When a user downloads a file, I attach the original name again.
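As an illustration of the file-handler idea, here is a hedged Python sketch that streams a stored file back under its original name in fixed-size chunks, so even large files never sit in memory at once; the storage layout and the set_header/write callables are placeholders for whatever your platform provides:

```python
import os

UPLOAD_DIR = "/var/app/uploads"
CHUNK_SIZE = 64 * 1024  # 64 KB per read keeps memory use flat

def stream_file(stored_name, original_name, set_header, write):
    """Send a stored file to the client under its original name."""
    path = os.path.join(UPLOAD_DIR, stored_name)
    set_header("Content-Disposition",
               'attachment; filename="%s"' % original_name)
    set_header("Content-Length", str(os.path.getsize(path)))
    with open(path, "rb") as f:
        while True:
            chunk = f.read(CHUNK_SIZE)
            if not chunk:
                break
            write(chunk)  # hand each chunk to the response as it is read
```

Access checks and bandwidth throttling would go in the handler before and around this loop.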
Consider a virtual file system such as SolFS. Here's how it can solve your task:
If you have returning visitors, you can have a separate container for each visitor (named after the visitor's login, for example). One of the benefits of this approach is that you can encrypt the container with the visitor's password.
If you have many visitors who are probably one-time, you can have one or several containers with the files grouped by date of upload.
A virtual file system lets you keep the original filenames either as actual filenames or as metadata for the stored files.
Additionally, you can compress the data stored in the container.

VB.NET Document Storage

I am attempting to add a document storage module to our AR software.
I will be prompting the user to attach a doc/image to their account. I will then put a copy of this file into our folder so that we can reference it without having to rely on them keeping the file in its original place. This system is not using a database; instead it uses multiple flat files.
I am looking for guidance on how to handle these files once they have attached them to our system.
How should I store these attached files?
I was thinking I could copy the file to a subdirectory and then rename it to an auto-generated number so that we do not have duplicates. The bad thing about this is that the contents of the folder can get rather large.
Anyone have a better way? Should I create directories and store them...?
This system is not using a database; instead it uses multiple flat files.
This sounds like a multi-user system. How are you handling concurrent access issues? Your answer to that will greatly influence anything we tell you here.
Since you aren't doing anything special with your other files to handle concurrent access, what I would do is add a new folder under your main data folder specifically for document storage, and write your user files there. Additionally, you need to worry about name collisions. To handle that, I'd name each file there by appending the date and username to the original file name and taking the md5 or sha1 hash of that string. Then add a file to your other data files to map the hash values to the original file names for users.
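The original module is VB.NET, but the naming scheme itself is language-agnostic; here is a sketch of it in Python, with the documents folder and the index file named purely for illustration:

```python
import hashlib
import json
import os
from datetime import date

DOC_DIR = os.path.join("data", "documents")       # new subfolder under the main data folder
INDEX_FILE = os.path.join(DOC_DIR, "index.json")  # maps stored name -> original name

def store_document(username, source_path):
    original_name = os.path.basename(source_path)
    # Hash of original name + date + user gives a duplicate-free stored name.
    key = "%s|%s|%s" % (original_name, date.today().isoformat(), username)
    stored_name = hashlib.sha1(key.encode("utf-8")).hexdigest()

    os.makedirs(DOC_DIR, exist_ok=True)
    with open(source_path, "rb") as src, \
         open(os.path.join(DOC_DIR, stored_name), "wb") as dst:
        dst.write(src.read())

    # Keep the mapping in a flat file, matching the rest of the system.
    index = {}
    if os.path.exists(INDEX_FILE):
        with open(INDEX_FILE, "r", encoding="utf-8") as f:
            index = json.load(f)
    index[stored_name] = {"user": username, "original_name": original_name}
    with open(INDEX_FILE, "w", encoding="utf-8") as f:
        json.dump(index, f, indent=2)
    return stored_name
```

As noted above, concurrent writers to the shared index file would still need locking or per-user index files.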
Given your constraints (and assuming a limited number of total users) I'd also be inclined to go with a "documents" folder -- plus a subfolder for each user. Each file name should include the date to prevent collisions. Over time, you'll have to deal with getting rid of old or outdated files either administratively or with a UI for users. Consider setting a maximum number of files or maximum byte count for each user. You'll also want to handle the files of departed users.