In Ektron, Load Last Active Location

Question:
Anyone know what setting in my user profile will make Ektron always load the last active work area location? If not, is there a way to load a specific folder every time?
Already Tried:
"Set smart desktop as the start location in the workarea." doesn't seem to do anything.
Why:
I'm primarily a designer, so I'm usually just replacing files in the library and leaving the content area to the developers. It's annoying that the content area always loads first: the folder structures look identical, and I often end up navigating into the content folder when I meant the library folder. That wastes time because Ektron is so slow. It would be really helpful to load the last active location, or at least the library files first.
Thanks!

The short answer is no. At least, not in any way that would be supported by Ektron or be upgrade-safe (upgrades would likely destroy changes made to include this functionality).
The long answer is that the Workarea source code is available, and a .NET developer willing to dig through it could probably figure it out. It would require adding a user property or cookie to store the last location (at the desired level of specificity) and then updating every related code file so it automatically returns the user to the recorded location. I'd advise against it, both because of the effort required and for the upgrade-safety reasons above.
THAT BEING SAID - Ektron's Workarea uses frames, and you may be able to bookmark one of those internal frames, or else build your own frameset in your own HTML that loads the view you want. It depends on whether it's important enough to you to put in the effort of inspecting the client-side code to figure those things out.


Sketch with Dropbox and multiple users

I am using Bohemian Coding Sketch to design websites. All my files are in Dropbox, shared with a team (another designer). Most of the time we are working together on the same files — one is editing, another is watching and discussing. I think this is a pretty common scenario these days.
When the other designer changes a file, Dropbox changes it on my disk too. And after that, things get worse. Sketch gives this warning:
Any choice I make is bad, because:
Revert changes. It does not mean "revert to the file on disk"; it actually means "revert to the state the file was in when you last opened it".
Save. Means "overwrite the other designer's work with your changes".
Cancel. Means "do nothing".
Since this dialog opens when I close Sketch, I have no option but to shoot myself in the foot.
Does someone have a solution? One proposed is to copy files from shared folders to view them, which works but smells funky...
OK, so this is a workflow issue. Sketch doesn't offer multi-user versioning, so it's a very bad idea for more than one person to work on the same file.
Your two choices are...
Have the other artist create a duplicate file. Not only does this ensure proper versioning (something you should be keeping track of anyway), it also allows the lead to take the best ideas from each and combine them into a new versioned file.
Purchase an asset management system like AlienBrain. It handles a lot of the tedious versioning processes for multiple artists in a studio. However, I'm not aware of how well it integrates with Dropbox.

SharePoint groups and shared libraries/lists

This is going to be vague, hopefully not annoyingly so. I know very little about SharePoint, but I'm asking on behalf of someone more knowledgeable who is under a lot of crippling pressure. Unfortunately I'm going to be held responsible for the project (it's due before Christmas!!), so I need to see what I can figure out on my own to help. Please let my desperation and helplessness excuse any problems with this question.
We've created an InfoPath form that generates xml files that will be uploaded to SharePoint. The data from these files will be aggregated and used to generate reports. The biggest issue is that the users will be spread out over three locations, and the info generated from each location needs to be firewalled from the others. But we need the xml files from all three locations to go to the same place in order to make the aggregation feasible with minimal manual work.
I've read something about SharePoint groups (http://technet.microsoft.com/en-us/library/cc262778%28v=office.14%29.aspx) and figured that might be the way to do it, so long as 1) the xml documents could all go to the same library/repository and 2) that shared repository would only show each group their own documents. For at least two users, we also need a master view that shows all of the documents regardless of the group that created them.
That's the main question. Ultimately we'll also need a similar way of exposing the generated reports (tables and charts) to the creators of the xml files AND to a set of users at each location who won't be able to view or create those xml files. But first things first, I guess.
Is this possible and feasible? Any hints/links that could get us started down this path?
I think the best option in your case is to create a folder for each group and set permissions so that only that specific group of users can access the folder. Do the same with a separate library for reports. Then you'd just set up a list view that flattens the folder hierarchy to show all items at once.
You could also set per-document permissions programmatically in an event receiver; however, there's a limit on the number of unique access control lists (ACLs) per library (it's 50,000, actually), so depending on the number of XML files you're going to manage, you may reach it.

How to compare test website and live website

We have our production server running our website. Then we have a test server with the exact same data but with code changes for some new functionality. This web app has over 500 pages.
Is there any program that can
Log in to the test site
Crawl each page and save it as HTML
Compare each page with the same page saved from the live site?
This way we can make sure that the new features we add to the test site won't break the live site when the code updates are applied to production.
I am currently trying to use the WinHTTrack website copier and then comparing the test and live folders with a code-comparison tool like Beyond Compare. This works OK, but a lot of files show as changed simply because of the domain name differences.
Looking forward to ideas / solutions for this problem.
Regards
Have you looked at using Watir for this? It's not exactly what you're looking for, but it might give you more granularity in your tests and let you ensure the site is functionally identical, rather than getting caught up on changing GUIDs, timestamps, and all the other things that tend to change from day to day across any significantly sized website as part of its standard functionality.
Apparently you can't make consistent, reproducible builds in your project? I would recommend moving toward that in the long run; it will save you a lot of headaches. That way you would know exactly what was deployed to which server and when, so there would be no more need to bend over backwards to reconstruct the deployed sources like this...
I know this is not a direct solution to your problem... but maybe it's worth weighing whether you would save more in the long run by investing the effort into your build process now, instead of implementing this workaround (and then improving your build process anyway, because one day you will almost surely need to).
wget has a --convert-links option; there are also options to preserve cookies that might let you crawl while logged in: http://drupal.org/node/118759#comment-664498
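If you'd rather drive the crawl from a script, those same wget flags can be assembled programmatically. A minimal Python sketch (the URL, output folder, and cookies.txt path are placeholder assumptions; wget must be installed and on your PATH):

```python
import subprocess

def build_wget_cmd(url: str, out_dir: str, cookies: str = "cookies.txt") -> list:
    """Build a wget command that mirrors a site, rewrites links to be
    relative, and sends saved login cookies with every request."""
    return [
        "wget",
        "--mirror",                 # recursive download, like a crawler
        "--page-requisites",        # also grab the CSS/JS/images each page needs
        "--convert-links",          # rewrite absolute links so the two copies match
        "--load-cookies", cookies,  # reuse an exported logged-in session
        "--directory-prefix", out_dir,
        url,
    ]

# Mirror the test site into test-copy/ (uncomment to actually run):
# subprocess.run(build_wget_cmd("https://test.example.com/", "test-copy"), check=True)
```

Run it once per host (test and live) with different output folders, then point your comparison tool at the two folders.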
Use an offline downloader to download all files to your computer from both sources, then compare the folder contents using a free tool like Total Commander.
EDIT
Load both of your sources into CVS, and compare them there.
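Whichever downloader you use, the domain-name noise can be stripped before diffing so that only real content changes show up. A rough Python sketch of that normalize-then-diff pass (the host names and folder layout are assumptions, not anything from the question):

```python
import difflib
from pathlib import Path

LIVE_HOST = "www.example.com"    # assumption: replace with your live domain
TEST_HOST = "test.example.com"   # assumption: replace with your test domain

def normalize(text: str) -> str:
    """Replace both host names with a placeholder so domain-only diffs vanish."""
    return text.replace(LIVE_HOST, "{HOST}").replace(TEST_HOST, "{HOST}")

def diff_trees(live_dir: str, test_dir: str) -> dict:
    """Return {relative_path: unified_diff_lines} for pages that really differ."""
    diffs = {}
    live_root, test_root = Path(live_dir), Path(test_dir)
    for live_file in live_root.rglob("*.html"):
        rel = live_file.relative_to(live_root)
        test_file = test_root / rel
        if not test_file.exists():
            diffs[str(rel)] = ["missing on test site"]
            continue
        a = normalize(live_file.read_text(errors="replace")).splitlines()
        b = normalize(test_file.read_text(errors="replace")).splitlines()
        delta = list(difflib.unified_diff(a, b, lineterm=""))
        if delta:
            diffs[str(rel)] = delta
    return diffs
```

Pages that differ only in which domain their links point at come back clean; anything left in the result is a genuine change worth a human look.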

Drag & Drop from Form to Windows; get drop destination

I have been developing an app in VB.NET which requires a control object (for example, a ListViewItem) to be dragged out of the form, and to a user-specified location (for example, on the desktop, or in a folder).
However, the file that is meant to be 'copied', which the ListViewItem represents, does not yet exist. It needs to be downloaded and then placed in the user-specified location. Am I able to get the path of the drop destination? I would then download the file and place it where the user specified.
I have looked at other questions about similar issues, which cover dragging outside the form; it's just that there doesn't appear to be a way to determine where the shortcut went or to flag the destination location.
Essentially, I am thinking that it may require some sort of 'dynamic link' or 'virtual file' as I've seen mentioned elsewhere. Then, after the drop operation, somehow accessing this 'link' from my application, proceed to download the file and place it in the final drop destination.
Any help is appreciated, thanks in advance!
OUTCOME:
Roger Lipscombe provided a link that contained links to other articles, with what looks to be promising information.
The following links may prove useful in implementing a drag-and-drop operation in managed code without providing the actual data up front.
Delay's Blog; Creating something from nothing
Delay's Blog; Creating something from nothing, asynchronously
You can ask Explorer to delay the IDataObject::GetData call for CFSTR_FILEDESCRIPTOR until the drop actually occurs by responding to CFSTR_PREFERREDDROPEFFECT in your IDataObject::GetData implementation. See http://hg.mozilla.org/mozilla-central/file/b49a6a8a4973/widget/src/windows/nsDataObj.cpp for an example. Note that if the target is a virtual folder, the drop target is not obligated to honor your preference.
Explorer checks clipboard formats for the file name in the following order:
CF_HDROP
CFSTR_FILEDESCRIPTOR/CFSTR_FILECONTENTS
CFSTR_FILENAME
Do not use CF_HDROP because it requires that the source file(s) actually exist somewhere in the file system. Use CFSTR_FILEDESCRIPTOR/CFSTR_FILECONTENTS instead.
Do you really want to know where the "file" was dropped? Or do you just not want to provide the data up front?
If the latter, Raymond Chen has a whole series on implementing virtual drag and drop, in native code. David Anson translates it into managed code and adds asynchronous support.
I'm sorry, but there is no way to get the target path of a DnD operation, because the drop target may not even have a path!
See here for a more detailed explanation.
Of course, you could try to hook into the DnD, ask for the target window, and from there try to find the target path if the window is known to you (e.g., if it belongs to the Explorer process).

browser plugin to test a site's look when migrating

I'm thinking I need a browser plugin that does the following, and if it doesn't exist, it should. I may as well say FF for now, but it could be any browser.
The problem: when moving a website from one server to another, you need migration testing. It is a pain to click on every link by hand and compare it to the old host. You really need two machines, or you have to constantly thrash your hosts file.
The plugin:
It would allow you to specify an alternate hosts entry for a website. Two entries would make it clear: one for live, one for test.
The plugin would crawl every link on the site, render each page in the browser, and save an image of the entire page.
It would then switch hosts and repeat, saving the images in a second folder. Since the rendering engines match, the images should match. We need to switch hosts (as in /etc/hosts) so that all absolute links are the same for the site.
This could be part of the plugin or external: now that we have two folders of identically named images, we run an image-diff program on the whole batch. A quick test would be a binary diff or hash, or we could get more sophisticated and measure how different each image is.
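The quick hash-based test is easy to sketch outside the plugin. A minimal Python version (the PNG extension and folder names are assumptions; a real tool would also need the crawling and screenshotting parts):

```python
import hashlib
from pathlib import Path

def file_hash(path: Path) -> str:
    """SHA-256 of a file's bytes; identical renders hash identically."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def compare_shots(live_dir: str, test_dir: str) -> dict:
    """Report which identically named screenshots match, differ, or are missing."""
    live, test = Path(live_dir), Path(test_dir)
    report = {"same": [], "different": [], "missing": []}
    for shot in sorted(live.glob("*.png")):
        twin = test / shot.name
        if not twin.exists():
            report["missing"].append(shot.name)
        elif file_hash(shot) == file_hash(twin):
            report["same"].append(shot.name)
        else:
            report["different"].append(shot.name)
    return report
```

A byte-level hash only tells you that two renders differ, not by how much; for the more sophisticated pass you'd feed the "different" list into a perceptual image-diff tool.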
This would save so much time. So can it be done with existing tools, or do I need to go write it?
Have a look at Selenium, it allows you to script interactions with the browser and verify content.
That is overengineered. What kind of website is it? How big? Which framework (PHP, JSP, Rails, etc.)? Why not copy the website onto the new server and grep the code for specific ties to the old server?
I'd concentrate on why you think the site would differ between two servers, and focus on testing those specific cases rather than the whole site. When a site is moved to a new machine the issues are generally very obvious from looking at a couple of pages.
Presumably they are both looking at the same data source, assuming there is a data source, otherwise a folder diff on the two installations would suffice. This being the case, it should be a simple task to identify which areas of the site are likely to be affected by a server migration.
Also, I personally wouldn't trust a machine matching two images to sign a system off as ready to go live. There just isn't a substitute for real human testing. Yes, it's time-consuming, but how important is your site?
Try http://www.browsercam.com/ - the free trial should allow you to specify a main page and follow links to take screenshots of the sub-pages automatically as well.