I have code in many Microsoft Access applications that populates a list with the names of all available printers, using code like this:
For Each ptr In Application.Printers
    ' ... e.g. add ptr.DeviceName to the list
Next ptr
While running an application locally, procedures using this code run very quickly.
While running the same application in a Remote Desktop session, it usually takes only a few seconds.
For one client, this one line of code takes 90 seconds to execute, but only the first time each day per user. Even after the Remote Desktop session is properly terminated and restarted, the problem resurfaces for me hours later or the next day.
The Server is Windows Server 2008 R2 Datacenter, SP1
Microsoft Office Professional Plus 2010 14.0.7188.5002
What have I missed?
If you allow the remote RDP session to include your LOCAL printers in that list, then it stands to reason that grabbing the printer list over the network is going to be rather slow.
When you launch the RDP client, you can disable this "feature" that lets software running on the remote server use your LOCAL printer(s). Worse, your local session might have several printers on YOUR network, so enumerating them all over the RDP connection can take considerable time.
So disable local printer use - the option that allows the remote server to communicate with and use your local printers - since that is a slow process.
I would suggest un-checking the Printers option (in the Remote Desktop Connection client under Options > Local Resources > Local devices and resources) when launching the RDP client.
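If your users connect from a saved .rdp file instead of the dialog, the same thing can be done by editing that file in a text editor; `redirectprinters` is the standard RDP-file property for printer redirection:

```
redirectprinters:i:0
```

With the value 0, the remote session no longer enumerates or uses the client's local printers.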
It turns out that a bad, faulty, or somehow uncooperative printer driver set up on the Remote Desktop server was the culprit. Even when local printers were turned off, the 90-second delay was experienced. Removing the bad printer setup on the server resolved the issue; using local printers then resulted in only a few seconds' delay. Everyone's suggestion that the issue might be with local printers led to turning them off, which eliminated local printers as the culprit, so thanks all for your input.
So I am connecting to my work computer from home and the Remote Desktop Connection app is annoyingly slow.
I pinged my work PC from my computer and it returned in a reasonable time of ~50 ms with 0 loss. I then attempted to ping my home IP from the RDP session and it timed out every time. Not sure if this might help anyone come to a conclusion, but hopefully it does. Note I am also using it in conjunction with Cisco AnyConnect Secure Mobility Client, if that helps at all. Work is Windows 7 and home is Windows 8.
I attempted switching off my home pc's firewall but that did nothing.
Any assistance would be great, surely a setting in the RDP file might make it run a little smoother.
I'll edit this post with further attempts at fixes below
Did three things and now RDP is running screaming fast:
1. Change the RDP settings
2. Run the RDP session and connect to the remote machine
3. Find mstsc.exe in the Task Manager and set its priority to Realtime
I installed XRDP on an Ubuntu server. Going through Windows it was terribly slow. I solved this problem: in the /etc/xrdp/xrdp.ini file, change crypt_level=high to crypt_level=none.
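On the server, that edit can be scripted; a minimal sketch, assuming the stock xrdp package layout on Ubuntu. Be aware that crypt_level=none turns off RDP-level encryption entirely:

```shell
# Back up xrdp.ini (.bak), lower the encryption level, and restart the service.
# WARNING: crypt_level=none sends session traffic unencrypted.
sudo sed -i.bak 's/^crypt_level=high/crypt_level=none/' /etc/xrdp/xrdp.ini
sudo systemctl restart xrdp
```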
Our remote chain is Citrix then RDP, target machine is Win 10.
I solved this issue by changing the mouse pointer scheme to None and disabling the pointer shadow.
In Windows 10, go to Display Settings >> Scale and Layout >> set the custom scale to 120 (you may need to experiment; try 110 - 150).
After that, log in to your Remote Desktop; it should adjust the resolution and scaling factors.
It gave me a faster experience. If you need more, then follow the answer of Mr. B.
I am developing Windows Store apps for the Surface tablet.
I am remote debugging onto a surface tablet via the local network. At first I had no issues with this, and then occasionally about one out of every four times it would fail to deploy, and I would get the message:
Error: Unable to connect to the Microsoft Visual Studio Remote Debugging Monitor named 'my_debugging_tablet'. The debugger cannot connect to the remote computer. The debugger was unable to resolve the specified computer name.
Initially when this happened, I would simply deploy the project again and the error would not occur again, or, occasionally, I would close and then re-open the Remote Debugging Monitor on the tablet, but generally this would happen seemingly randomly and not re-occur.
However, lately, it has been happening more and more often (with no changes to my code) and now I have been unable to deploy at all, ever, for a couple of days now (and thus I cannot debug on my tablet.)
The same error message listed above is what displays every time I try to deploy or debug.
I verified in project properties that the target device and remote machine name were set correctly, and each time verified that the connection on both the surface tablet and my host computer were fine (my host machine is Windows 8 on Oracle Virtualbox.)
From project properties, if I attempt to manually "Find" the target device (as it does when you deploy back when this used to work) it is unable to locate my tablet (or anything) on my local network. ("Found 0 connections on my subnet")
My MS developer license registration is up to date as well. Additionally, there doesn't seem to be an issue with the local network, as both my host machine and the tablet can "see" other things on the network (printers, etc.)
I can't for the life of me figure this out, because, as I mentioned, there have not been any changes to anything such as developer license registration, network status, code, or anything else that should have affected this.
I originally read your question and thought you were saying the two devices could see each other, except through Visual Studio. I was scratching my head at that.
Visual Studio just uses the OS to resolve names and addresses. I recommend troubleshooting the connectivity problems outside of VS, as the problem is larger than just trouble with remote debugging.
Try nbtstat -n to verify you can see what you expect on your network.
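Beyond that, a few commands from an ordinary prompt will separate name resolution from reachability; a hedged sketch, where `my_debugging_tablet` and the IP address are placeholders for your own device:

```shell
nbtstat -n                     # show the local NetBIOS name table (Windows only)
ping my_debugging_tablet       # does the NAME resolve, and does it answer?
nslookup my_debugging_tablet   # check DNS resolution specifically
ping 192.168.1.50              # hypothetical tablet IP: if the IP answers but
                               # the name does not, it is a name-resolution problem
```

If the name fails but the IP works, you can usually enter the tablet's IP address as the remote machine name in the project properties as a workaround.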
Situation:
My organization has "Unified Communication" with Microsoft Lync. I occasionally have to sit at another desk, so I use a remote desktop connection to continue using my usual computer. This means I have to log out of Lync on my usual computer so I can log into Lync at the other one. When I return to my usual computer, I never remember to log back into Lync.
Question:
Is there a way I can automate the process? Like, can I do something so that every time I start a remote session with my usual computer, it automatically exits Lync, and every time I end a remote session it automatically starts Lync up again?
My experience is limited to simple batch files and Visual Basic, but I'm pretty good at learning just enough of something to do simple tasks.
Any input is appreciated. Thank you.
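Since you mention batch files: one sketch of an approach is two one-line batch scripts fired by Task Scheduler, which can trigger a task "On connection to a user session" and "On disconnect from a user session" (both have a remote-connection option). The Lync path below is an assumption for Lync 2013; the executable name and location vary by version:

```bat
REM kill-lync.bat - task triggered "On connection to a user session" (remote):
taskkill /IM lync.exe /F

REM start-lync.bat - second task, triggered "On disconnect from a user session":
start "" "C:\Program Files\Microsoft Office\Office15\lync.exe"
```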
Lync will let you log in to multiple devices simultaneously, or end-points as Microsoft calls them. If your usual computer is locked, that device should "detect" you as away. If you are logged in to another computer, THAT one will display your presence as Available, unless you manually change your status.
There is no reason to log off of the usual computer.
Do you really need to log out of Lync at your usual computer? Lync lets you sign in to multiple devices, and will generally do the right thing in ensuring you receive any IMs/calls.
So when you switch machines, you could just sign in on the new machine, leaving the old logged in.
Or is there a specific reason why this doesn't work for you?
This is quite odd.
I have a Windows service that works OK, but when the computer is restarted the service gets stuck on a call to WebRequest.GetSystemWebProxy for about 11 minutes - every time, on both XP and Vista, where it has been tested so far. If the computer is connected to the company's domain, it works. But when it's not, I'm facing this issue.
Some other weird things happen on Vista as well. For instance, it doesn't have an active network connection (it shows as disconnected) during this time, and some other Windows system warnings appear just after these 11 minutes. Before the 11 minutes are up, the system seems more or less hung, waiting for something.
It's not about my machine, because I've tested it on XP, and Vista, and also on some VMs of them.
I'm pretty sure that the call to WebRequest.GetSystemWebProxy in your case actually tries to connect to some system on your company network to:
1. Find out whether a proxy server is present
2. Get the script that automatically configures these settings on your system
Please try a tool like Fiddler or Wireshark to see which outgoing connections are attempted. The URL will point out the location it tries to connect to and the reason for the call.
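If the trace shows the stall is proxy auto-detection (a WPAD lookup timing out when off the domain) and the service doesn't actually need a proxy, you can opt out of auto-detection in the service's app.config; a sketch using .NET's standard <system.net> configuration section:

```xml
<configuration>
  <system.net>
    <!-- Don't auto-detect proxy settings; use direct connections -->
    <defaultProxy enabled="false" />
  </system.net>
</configuration>
```

Note this affects requests that use the default proxy; if your code calls GetSystemWebProxy explicitly, you may need to skip that call as well.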
Can someone explain what is the difference between X server and Remote Terminal servers in simple terms?
For example, Hummingbird Exceed is an X server and Citrix is a Remote Terminal Server. How do these servers work?
A terminal server runs on the "other" machine while you use a remote desktop client to view that machine's screen.
An X server (of the X11 Window System) runs on your machine while another machine (or several) sends its output to your computer.
The most important difference to the end user is probably "culture": with the X Window System you typically work with windows that run on several hosts. (You often sit in front of a quite stripped-down workstation and get one application from one computer, another one from another computer.) When working with X, things feel very heterogeneous - a special application only runs on an HP workstation while your company is stuffed with Suns or Linux boxes? No problem, just buy one HP; everyone can use that application over the network as if it were local.
Remote terminal services feel more like another computer sending its complete screen to you - as if you had a 100-mile-long monitor and USB cable (with a little lag built in). You typically use a remote desktop client that sends a complete desktop to you.
However, in recent times both techniques have gotten closer to one another - Windows Remote Desktop (which is based on Citrix code) can send only application windows to your desktop, while a lot of programs based on X11 are theoretically network transparent but practically need to run on the local machine. (Sorry, no 3D shooter over the network - an extreme example.)
Which one is better? I don't dare to say. While X11 is a lot more flexible (it was designed with network transparency in mind - it makes absolutely no difference whether an application runs locally or remotely), it is in many aspects more complicated. As long as there was no remote desktop sharing it had a clear advantage, but the gap is slowly closing - for example, terminal services now allow you to do many things that in earlier times were available only with X11.
By the way, the main reason many X11 applications still feel a little "snappier" over the network than their Windows counterparts is that many application programmers on Windows still assume they always run locally and dump a lot of bitmap graphics on the screen - like custom toolbars in ZIP tools. X11 applications did not do this for a long time and chose "ugly but fast", because X11 forces you to think about the network. But as X11 applications get prettier and Windows programmers become more aware of terminal services, the difference will dwindle.
Oh, and an important point: X11 is deeply ingrained in the Unix way of things; Citrix is mainly used on Windows (in the form of Microsoft's Windows Terminal Services, which originated in Citrix code). So lock a terminal services admin and an X11 operator into a cage and step back to watch the bloodshed when they figure out who they are locked in with...
An X server most likely refers to the X11 windowing system, which is the GUI that most Unix flavors (including Linux) use. It's a client/server setup, and it has been around for a very long time.
A remote terminal server, in the case of Citrix, is a remote Windows instance that can be connected to with a special Citrix client. The Citrix environments I'm familiar with are all MS Windows solutions, i.e. they work similarly to X but are for Windows servers only.
They both operate in a similar fashion: serving a windowing solution to a remote client. That is, they both let a server run the actual application while the display of that application is sent back over the network to a client PC.
A 'Terminal Server', as it's called, basically allows you to connect to a Windows session remotely. It employs a bit of magic to make the experience snappy over connections with latency. The Windows GUI system isn't network transparent like X, so it took a while longer to get this feature. Windows Server 2008 and Citrix products have the ability to let you use a single application, unlike the traditional Terminal Server.
X is the GUI protocol for Unix/Linux. The X server accepts connections and displays their windows. The clients are actually the programs themselves. These clients can be local or remote, it doesn't matter to X. X just displays them as requested, on the local screen or over a TCP connection. This is lower level stuff than terminal servers, and allows graphical programs to run on one machine and display on another. X11 doesn't compress or encrypt the traffic like RDP does (although SSH can help you out there).
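That network transparency is easy to try; a minimal sketch, assuming an X server is running on your local machine and X11Forwarding is enabled in the remote host's sshd_config (`remotehost` and `xclock` are just placeholders):

```shell
# xclock executes on remotehost; its window opens on your local display,
# tunnelled (and encrypted) through SSH
ssh -X user@remotehost xclock
```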
The Linux equivalent of RDP is NX; NoMachine provides free software to run NX servers/clients. I've used it and it works pretty well.