Opening a project in an IDE / editor over Samba = SLOW

I'm not sure if this is the correct forum to ask this question, so I'll probably be told about it, but anyway - I'm connected to the Samba share on my company's development server from home (where I work now), and when viewing the files through Explorer (Windows 7) the browsing is relatively quick. However, when I open a directory on the Samba drive as a project in an IDE - whether it be Aptana or eTextEditor - browsing the directories in the project panel is unbearably slow.
Any ideas?
Thanks in advance.

We ran extensive trials with the source stored on an SMB (CIFS) mounted disk at the enterprise software company where I work, and the conclusion was that it is not possible to tweak this to any acceptable performance, since CIFS performance when handling large numbers of small files is so poor.
In our scenario the terminal server running the IDE was in the same data center as the app servers serving the CIFS shares, so the network performed comparably to local disks.
We also invested some time trying this out with NFS on Windows, but the performance was only slightly better there. For comparison we set up the same scenario with NFS and Linux, and that turned out to work rather well.
The difference between Explorer and an IDE is that Explorer only bothers about one directory or file at a time, while your IDE will access all of your files a lot.
The way to go is probably a VCS with a local install of the IDE at home, or a remote desktop solution.
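As a rough sketch of the VCS route, assuming the dev server is reachable over SSH and the code lives in a git repository there (the hostname and paths below are made up):

    # Clone once over SSH so the working copy lives on local disk:
    git clone ssh://you@devserver.example.com/srv/git/project.git ~/work/project

    # Open ~/work/project in the IDE - every file access is now local.
    # When you're done, push the changes back:
    cd ~/work/project
    git add -A
    git commit -m "work from home"
    git push origin master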

Related

Run IntelliJ in client-server mode

Currently, IntelliJ IDEA does not have a "Remote Development" feature.
Let's say I have two machines: Machine 1 (a very good configuration, e.g. 64GB RAM with Intel Xeon processors) and Machine 2 (a MacBook Pro with 8GB RAM).
Let's say I have IntelliJ IDEA installed on both machines. The problem is that there is no client-server mode for the IDE. The closest thing I have is to use OpenNX.
What I'm looking for is a plugin/feature that enables remote development. What I mean by this is: on my MacBook, I should be able to add Machine 1 as a "server". Once that is done, the IntelliJ IDEA on my MacBook will only act as a client for the IntelliJ IDEA on my Linux box. Basically it would be replicating the UI. However, the catch here is that it shouldn't do so by sending images (the way any VNC or NX client would). Instead, since it is for a specific application, most of the data can be exchanged as text only.
Since OpenNX uses images, even with compression it wouldn't match the performance of text-only transmission.
Basically I'm looking for IDEA on one machine to be a client (remote GUI) for IDEA on another machine.
UPDATE
The eventual answer is: this is not possible (as of now). While I was aware of other options, that wasn't what I really wanted. However, it appears there is no such option.
The main reason I wanted the option is that my desktop (the remote Linux box) has a much higher configuration (an Intel Xeon 2GHz processor and 64GB RAM) while my client is a MacBook Pro with an Intel Core i7 and 8GB RAM (by no means underpowered). However, due to the size of my codebase, the IDE's indexing of the code slows it down.
Both client and server are perfectly capable of running an IDE by themselves. However, due to the size of the code base it would be better to have the bulk of the IDE's work done on the server, with the client being just the front end to it.
The other solutions - VNC, NoMachine/OpenNX - all use image compression. And when your client is a Mac, you run into keyboard mapping problems. A client-server mode in the IDE itself would use text compression instead and would be much faster. It would also solve the keyboard mapping problems.
While it sounds like a good idea to me, it probably doesn't get used by enough people for it to become a feature of the IDE.
Note: I would also be open to considering Eclipse as the IDE if this feature is available there. Any answers will always be appreciated.
You'd probably be better off switching instead to a remote code repository that you keep in sync. While the concept of doing this in an IDE plugin is interesting, it has some fundamental flaws. What happens when the machines can't talk to each other? Are you unable to work at that point, or can you work offline? If you work offline on both machines, how do you reconcile changes?
I suggest looking into using "git". You can set up a remote repository very easily. If you have ssh access to either machine or some other shared machine, you can create a remote repository on that machine, and your "client" machines can easily push files/changes around.
There are plenty of other code repository options, but I've found git the easiest to set up.
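For illustration, the whole setup described above boils down to a handful of commands; the host name and repository path here are placeholders:

    # One-time setup on the shared machine you can ssh into:
    ssh you@shared-host.example.com 'git init --bare ~/repos/project.git'

    # On each client machine:
    git clone ssh://you@shared-host.example.com/~/repos/project.git
    cd project
    # ...work, then push so the other machine can pull:
    git add -A
    git commit -m "describe your changes"
    git push origin master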

Sandbox a Java applet

I heard that the Minecraft server is very leaky and can consume a lot of resources very quickly. People say to use a virtual machine; all well and good. I'm making an application to automate server setup, and I'd like my whole application (including Minecraft) to run in an ultra-basic, automatically set-up VM (or something similar). I've heard of MineOS, but I'm not sure if that can be set up very quickly. The VM will be so basic it won't even have a UI. I'm using a Mac, and I'm not planning to distribute the server WITH the application, but to have it downloaded from the Minecraft site, unmodified.
I want it to be a one-click-done solution for the end user; they shouldn't have to worry about the Minecraft server gobbling up resources, because it'll be in a controllable virtual machine.
Distributing the Minecraft server (Notch's property) could be an issue, but if anyone knows about that I'd be happy to hear.
If you intend for a server to be fully configured, so that your user only has to download and 'open' it, what you're seeking is known as an 'appliance'. VirtualBox supports an open standard for such appliances, allowing a single file to be distributed that contains all the virtualized hardware info as well as the OS/filesystem. A number of other formats exist, such as Turnkey.
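A minimal sketch of that flow with VirtualBox's command-line tool (the VM name and file names here are made up):

    # On your machine: export the fully configured VM to a single appliance file.
    VBoxManage export "MinecraftServer" --output minecraft-server.ova

    # In your installer, on the end user's machine: import it and start it headless.
    VBoxManage import minecraft-server.ova
    VBoxManage startvm "MinecraftServer" --type headless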
In all likelihood, MineOS CRUX would be perfectly suited for this sort of one-click-done setup, since the OS was designed for pretty much exactly what you're trying to do... only without the configure-the-hardware-for-the-user part (it uses an ISO and an installer, the process you would automate for the end user).
That said, this distribution has never at any point packaged Minecraft files, as clearly stated: "this Linux distro does not contain ANY Minecraft files. The scripts are, however, designed to download/update files directly from the source: http://minecraft.net"
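As a rough idea of what that download/launch step looks like (the URL was the official download location at the time and may well change):

    # Fetch the server jar straight from the official source:
    wget https://s3.amazonaws.com/MinecraftDownload/launcher/minecraft_server.jar

    # Run it headless with a 1GB heap cap, per the standard server instructions:
    java -Xmx1024M -Xms1024M -jar minecraft_server.jar nogui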
Hope this answers all the concerns, despite being an old thread.

Virtual PC 2007 as programming environment

I'd like to create a VM in Virtual PC 2007 for use as a development environment/sandbox for an existing ASP.NET application in Visual Studio 2005/SQL Server 2005 (and VSS for source control).
I'm thinking that I need to create a 'base' copy of the environment (with the OS, Visual Studio, and SQL Server), and then copy that to a 'work' version that I do actual development in. I would be sharing this VM with one or two other developers who would be working on different parts of the app.
Is this a good idea? What is the best way to get my app/databases in and out of the VM and the changes I make into VSS? Is it just a copy from the host location to the VM share and back again? How do I keep everything synchronized?
Thanks!
I would seriously suggest the following:
Use a "server" solution, rather than a desktop solution. That's far more reasonable if you want to share the VM environment with other developers.
Use VMware's products rather than Microsoft's.
From these two points it follows that you should use VMware ESX Server and related products. If you don't want to / can't invest money in it, there's a free version of this product: http://www.vmware.com/go/getesxi/, though I've never used it.
Whether you choose the enterprise version of ESX Server or the free version, I suggest you get your organization's IT department involved.
It's not a bad idea, if you think there's a need for it.
I do something similar when I need to develop a Windows app, because it's just nice to have a clean environment. That way I don't accidentally add a reference to something that's not necessarily included in the .NET Framework. It forces me to install any 3rd-party components as I'm developing and documenting. This way I can anticipate prerequisites and ensure that I have them documented before I load software onto a user's PC and wonder why it doesn't work.
Just make sure the PC it's hosted on can handle the additional load. My main dev PC has a dual-core processor and 4GB RAM; I devote 2GB to any virtual PC I plan on using as a development environment so that I don't hit too much of a performance snag.
As for keeping everything synchronized, you will want to use some sort of source control (as you should even in a normal environment). (I like SVN with TortoiseSVN as my client of choice, but there are plenty of alternatives.) Just treat the virtual PCs as if they were normal PCs. Make sure they can access the network, so you can all access your source code repository.
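With SVN, for example, that amounts to the same couple of commands inside each VM as on a physical box (the server URL and paths are placeholders):

    REM Check the project out inside the VM:
    svn checkout http://buildserver/svn/myapp/trunk C:\dev\myapp

    REM ...edit, build, test, then commit back to the shared repository:
    svn commit -m "changes made in the dev VM" C:\dev\myapp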
You can use the snapshot feature (or whatever it is called): changes to the "system" are saved to a delta file so that you can easily revert to an earlier state of the virtual PC. It has some performance penalty. This way you don't have to keep base and work copies.
I use Virtual PCs for all of my Windows development. The company I work for has legacy products in FoxPro and current products in .NET so I have 2 environments set up:
1 - Windows XP with FoxPro and VSS - I can access VSS directly from this image and the code never enters other machines in my network (I work remotely).
2 - Windows 7 with VS2008 and all the associated bits and pieces needed to develop our .NET software (including TFS). This is the machine I use every day - I have a meaty desktop PC, so I am able to give the VPC 4GB RAM, and it runs as fast as a 'normal' PC.
I have my VPCs running in VirtualBox and it is just as good as the other offerings. A previous answer mentioned VMware ESX, which is an excellent product for large-scale deployment, but if you want a server solution then VMware Server is free and is a nice virtualisation platform.
If you are looking at ways to experiment with changes and still want to use VPC then undo disks are excellent - you fire up the machine, hack away to your heart's content, and when you shut down you can choose to save or discard the entire session.
For me Virtual PCs are an excellent way to quickly set-up / tear down development environments and I would struggle to return to using a single machine for all my work.

Pros and Cons of Developing on a VM on a PC

I recently built myself a semi-beefy PC (Q9450, 8GB DDR2 1066, 1TB HDD, dual 8600GT, Vista Ultimate and dual 22" monitors) and I'm evaluating whether I should develop in a VPC/VMware session on top of Vista or not.
One benefit I can see is that I can run the same VM on my Vista laptop, so my development environment is the same on any of my machines. I also plan on purchasing an MBP before the end of the year.
I found a couple of articles online that semi-help.
Any other thoughts would be really appreciated.
For web development I like to have the server part separated out into a VM. My current setup is a MacBook Pro with several Debian VMs inside. I like the isolation aspect of it: I can try new software on the servers and have the ability to revert them back if something gets messed up.
I do the programming over a network share (Samba) in TextMate on the host system.
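For the curious, the host-side mount is a one-liner on the Mac; the VM hostname and share name here are invented, and TextMate's optional "mate" shell command is assumed to be installed:

    # Mount the Debian VM's Samba share on the Mac host...
    mkdir -p ~/mnt/webserver
    mount_smbfs //user@debian-vm/projects ~/mnt/webserver

    # ...and open the mounted tree in TextMate:
    mate ~/mnt/webserver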
Another advantage of a VM is having a clean installed base. I use my desktop and laptop for lots of things aside from development. You never know when a piece of software you install is going to conflict, or when the little tweaks and whatnot you play around with are going to trash your OS. Reinstalling/configuring all your tools so they are exactly the way you want them can take quite some time. If you have a backup of your development VM image, you can mess up your PC as much as you want and still be able to code without downtime. It also allows you to run Windows/Visual Studio/etc. on a box on which you would otherwise prefer Linux or MacOS.
You can also make multiple copies of the same Image and use each one for a separate project.
Being able to transition between a laptop/desktop/server/remote connection, and always be in the same environment is also very helpful.
One problem I found (at least when using VMware Server) is that no matter how fast your machine is, the screen refresh rate is still only around ~30 Hz. That makes for a slightly unpleasant experience after using it for a while.
Where I'm working now, I use a VM for all of my development because I don't have admin rights on my base copy of XP.
Pros:
I like using VMs because they give you some flexibility: you can switch between machines, have programs running on both, and have a cool environment to work in.
Cons:
You have to boot up multiple operating systems. This takes time, memory and resources.
Clipboard operations on VMs can be interesting at times. Sometimes copying to the clipboard does not work, or the clipboard gets mixed up between VMs (using VMware).
File operations can be interesting when you plug in USB drives and other external devices. VMs sometimes see the devices and sometimes don't.
If your VM image becomes corrupt, you can easily lose everything in it... unless it is backed up.
It's great for presenting development talks, you can revert to a snapshot and give the talk from the exact same starting point each time.
Bulk up the RAM on your future MacBook Pro if VMware will be used. I haven't (yet), and performance with several other (Mac-side) apps open really starts to feel sluggish.
All the best.
I was doing some work with Visual Studio recently in a Windows XP VM on Linux, and somehow the guys who made the VM software (VMware) made the Windows machine actually run faster. We did some time tests to make sure; it wasn't major, but a few things (autocomplete, for example) really did pop up faster.
If you are on Windows, Virtual PC is pretty decent for development work. VMware Server is not really designed for use as a desktop and you will get very tired of it with any prolonged use. Sun's VirtualBox is another option competing with Virtual PC. VMware has a workstation product, but it is not free.
Typically, I do development on the real desktop (non-virtual) and then deploy or test to virtual machines which I can snapshot and roll back easily.
For a long time, we were developing on very early versions of Visual Studio 2005 and the associated .Net bits that went along with it. To protect our real machines from the various problems associated with pre-release software, we did all of our development work inside virtual machines. It worked amazingly well. I've been considering moving back to that model as it makes upgrading the physical hardware a snap (not to mention making it easier to deal with hardware failures by just replacing the entire machine): you just copy the VM image over.
On my current machine (a Core2Duo with 4GB of RAM), the performance drop when running one VM is barely noticeable. Running two VMs, however, is painful.
I also can't figure out how to get VMWare Server to work across two monitors well.
I wouldn't want to develop in a VM so much as test things in a VM. For instance, it might be nice to set up a couple of VMs to emulate an n-tier architecture, or a client-server setup, or simply to test code on multiple OSes.
It depends on what you are developing and in what language.
VMs tend to take a fairly hard hit on disk access, so compiling may slow down significantly, especially for large C/C++ projects. I'm not sure if this would be as much of an issue with .NET/Java.
If you are doing anything that is graphics intensive (3D, video, etc) then I would steer clear of a VM too.
I don't know that it is so useful as a development platform unless you are doing something that ties into software you don't want installed on your regular working machine, or that needs to work around a certain event you must be able to reset on a regular basis. It can also be handy when you are working with code that risks crashing your computer, as it will at least only crash your VM.
It is brilliant for testing different configurations and setups - working with installers and so on. That is where virtualisation really shines as far as I am concerned: being able to roll things back whenever you need to and run through stuff repeatedly is amazingly useful for identifying problems before your end users run into them.
While doing development at home, I have to VPN into my company to be able to use the collaborative tools that are on the intranet. I also have a desktop + laptop that are hooked together through Synergy.
The problem that I have is that our VPN software wants things to be so secure that it will force all network routing through the VPN gateway -- even if I'm using additional NICs to network my desktop and laptop through a separate private network. The end result is that I can't use Synergy between my desktop and laptop and VPN into my company at the same time.
The solution suggested to me by a co-worker was to setup a VM instance on my desktop and use that for all my VPN needs. Works like a charm!
Speaking from personal experience developing Java in an Ubuntu VM on Windows 7, I've found this to be quite productive - mainly because my local IT support on the ground supports Windows 7, so I can do things like access all the local file shares and printers in Windows, and then configure my Ubuntu VM to my heart's content.
There are huge productivity benefits around remote access and desktop sharing. Windows allowed me to very quickly and easily use tools like logmein.com and join.me to access my machine from home and to desktop-share the VM with other people in the company (both work seamlessly with the VM in a nearly full-screen window). Neither of these services is supported on Linux, and I wouldn't want to deal with all the associated VNC/X setup and network config on Ubuntu.
My machine is fairly beefy: quad-core, with 16GB RAM - 8GB for the VM. Java dev in the VM is pretty quick.

Any Tips for Doing *All* Your Work in a Single Virtual Machine?

I bought a new Vista PC recently but was having lots of problems getting everything to work on it, so I continued doing most of my work (development and other) on a slow XP machine that I've had for years.
Until now, that is - I used VMware Converter to take an image of my old XP machine, and now I'm running it on my Vista machine and doing pretty much all my work within that XP virtual machine. I'm using VMware Workstation.
So each morning I boot up my Vista machine, and then I boot up my XP virtual machine and spend the whole day working in the XP virtual machine.
Yes, you can probably guess: I'm the complete opposite of a VMware power user... I've not figured out snapshots, linked clones, or anything more than the absolute basics of running a VM. But I set this system up OK, and it's working well. Everything's running a lot faster than it was on my old machine anyway.
However, I'm concerned about the VM getting corrupted or something and causing me to lose everything. Of course I can back the whole VM up, and I can back up files from within the VM, and I will, but I'm wondering if it might be easier and safer to use a mapped drive or public folder or something for all my work, so that if the XP VM goes kaput, my files will all be available from the Vista machine.
This would also be good because I could share files easily between the Vista and the XP machine (I do use Vista for the odd thing). But I'm wondering if it'll make it much slower to read and write files from my XP machine? (e.g. if I'm compiling a big Java project, which will involve lots of IO at once.)
The information on how to set these things up is readily available, but I haven't found it so easy to figure out the best approach for what I'm doing. Most people are using VMs for much more advanced purposes than mine.
Also I'm wondering if there are any other tips or important considerations for this doing-all-your-work-in-one-VM type of setup? e.g. what's likely to go wrong, and how can I avoid it? Anything else?
I have an Ubuntu Linux box at home which has three VMs, all totally self-contained.
The first is for my wife's business, she needs access to all the MS Office stuff and MYOB.
The second is for work, they're too tight to buy me a laptop and I'm not going to let them install their hideous security and auto-update products on my real box.
The third is my Visual Studio development VM.
It runs like a dream (although I've only ever tested one VM at a time). And I just back up all the VM files from Ubuntu (along with my Linux work as well), which basically gives me images of the VM hard drives.
Surely if you are doing all your work in a VM, it's time to think about changing your host machine to one that's usable, no?
As others have pointed out, it is time to think about changing your host OS to one you are comfortable with and can get your work done on. Depending on what you do on your machine day to day, I can bet Vista is going to be nothing but a big hurdle. Why tax your work and yourself by running VMware on top of a beast like Vista, only to do all your work inside the VM?
Having said that, I do suggest that you look into VMware snapshots and cloning. Those are two powerful features - not least the former, in your case - which can be used to avert, in addition to solving, a lot of common problems you can run into while running any OS inside a VM.
I perform a crude backup once in a while, where I compress the VMware image on disk with tools like 7-Zip and store it on backup media. However, for backups or restore points within the system, VMware's linked cloning is definitely a handy feature - since Windows is susceptible to getting corrupt/infected often, with linked cloning you can be pretty sure that you can easily revert back to the last state before the corruption took place, and continue your work unimpeded from there.
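That crude backup amounts to something like this (the paths are just examples; power the VM off first so the disk image is consistent):

    REM Compress the whole VM directory into one archive on backup media:
    7z a -mx=5 E:\backups\xp-dev-vm.7z "C:\Virtual Machines\XP-Dev"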
I have been using VMware at work for a couple of years now. I use it for development and testing. As long as your base PC is good enough, it is a really good way to separate your "PC life".
I would certainly store your data files on a server somewhere. This can be a mapped drive, source control, or whatever. When you start using snapshots it is really easy to wipe a session, so treating your base PC as a kind of NAS avoids this problem.
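Inside the VM that can be as simple as mapping the share once (the server and share names are placeholders):

    REM Map a persistent drive to the server so all work lands outside the VM image:
    net use W: \\fileserver\dev /persistent:yes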
I have now decided to start using VMware at home. I have a VM for business apps (Office, QuickBooks etc.), one for Visual Studio development, and several others for web servers, SQL servers etc. My base PC has 8GB RAM and a 2.8GHz quad-core processor, so running four or more VMs is no problem.
"I'm wondering if it might be easier and safer to use a mapped drive or public folder or something for all my work"
Please please please, use a version control system (that is also backed up) if you're working mainly with text files. A mapped drive or public folder is accessible, but not the best way.