Exporting RDLC report to Excel or PDF makes server CPU go high - rdlc

I have been having an issue with ReportViewer. For one of the reports, when we export a large report to Excel or PDF, it opens in a new window and processes forever, and it also drives server CPU utilization to 100%. Has anyone come across this kind of issue?
Your help is greatly appreciated.

Related

Pentaho text file input step crashing (out of memory)

I am using Pentaho to read a very large file (11 GB).
The process sometimes crashes with an out-of-memory exception, and sometimes it just says the process was killed.
I am running the job on a machine with 12 GB, giving the process 8 GB.
Is there a way to run the Text File Input step with some configuration that uses less memory? Maybe use the disk more?
Thanks!
Open up spoon.sh/.bat or pan/kettle .sh or .bat and change the -Xmx figure; search for JAVAMAXMEM. Even though you have spare memory, unless Java is allowed to use it, it won't help. Although, to be fair, in your example above I can't really see why or how it would be consuming much memory anyway!
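For reference, a sketch of the kind of edit meant here; the variable name and shipped defaults vary by PDI version, and the 8 GB figure is just an example matching the question:

```shell
# In spoon.sh / pan.sh / kitchen.sh, find the JVM options line and raise -Xmx.
# Shipped default (version-dependent) might look like:
#   PENTAHO_DI_JAVA_OPTIONS="-Xms1024m -Xmx2048m"
# Raised so the job may use up to 8 GB of heap:
PENTAHO_DI_JAVA_OPTIONS="-Xms1024m -Xmx8192m"
```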

VBA speeding up reading csvs from large non local folder

I was looking at a macro that imports several CSVs from a file server. Running the macro takes about 20 seconds to initialize before the first CSV gets imported; the imports themselves happen fairly quickly. If I run the macro a second time, there is no delay.
When I manually open the folder on the file server with Explorer, it also takes quite a while (30 seconds or so) until all the files are shown, so I assume the macro also has to wait until the relevant files are loaded. So, my question: is there a way to have Excel automatically index that folder so it opens more quickly, or can I run a process in the background when opening the Excel file that would read the folder?
Cheers,
CE
Edit: I cannot archive the folder to make it slimmer.
The files might be cached in memory, thereby avoiding lengthy disk I/O. You need to monitor your machine's CPU, I/O, and network activity to figure out where the time is spent. Launch perfmon.msc and add the relevant counters to do so.
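If you prefer the command line, Windows' typeperf tool can log the same counters perfmon shows; the counter paths below use the English names and are a hypothetical starting set:

```
typeperf "\Processor(_Total)\% Processor Time" "\PhysicalDisk(_Total)\Avg. Disk Queue Length" "\Network Interface(*)\Bytes Total/sec" -si 1 -sc 60 -f CSV -o perflog.csv
```

That samples once a second for a minute and writes perflog.csv, which you can chart in Excel to see whether the 20-second stall is CPU-, disk-, or network-bound.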

ImageResizer crashing on large images

I'm using the awesome ImageResizing component and am experiencing an "Out of memory" error when trying to upload and read images that are about 100 MB in size. That may seem large, but we're a printing company, so many people do need to provide images of that size.
The line of code that fails is:
ImageResizer.ImageBuilder.Current.Build(Server.MapPath(strImagePath), Server.MapPath(strThumbPath), new ResizeSettings("maxheight=150&maxwidth=238"));
This is probably GDI itself failing, but is there any workaround other than detecting that the error occurred and letting the user know?
Thanks in advance
Al
A 100MB jpeg generally decompresses to around 8 gigabytes in bitmap form. Your only chance of getting that to work is getting 16 GB of RAM and running the process in 64-bit mode.
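A rough sanity check on that figure (the dimensions are hypothetical; real JPEG compression ratios vary wildly):

```shell
# A ~100 MB JPEG at roughly 80:1 compression could be about 45,000 x 45,000 px.
# Decoded to a 32-bit (4 bytes per pixel) GDI bitmap, that becomes:
echo $(( 45000 * 45000 * 4 ))   # prints 8100000000 bytes, i.e. about 8.1 GB
```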
Alternatively, you could try libvips - it's designed for gigantic image files. There's no .NET wrapper yet, but I really want to make one and get some ImageResizer integration going! Of course, without anyone interested in funding that, it probably won't happen for a while....
As mentioned by Lilith River, libvips is capable of resizing large images with low memory needs. Fortunately, there is now a full libvips binding for .NET available: https://github.com/kleisauke/net-vips/.
It should be able to process 100MB jpeg files without any problems.
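If you want to try libvips outside .NET first, its bundled vipsthumbnail CLI does the same low-memory, shrink-on-load resize (file names here are placeholders):

```
vipsthumbnail huge-upload.jpg --size 238x150 -o thumb.jpg
```

Because libvips streams the image rather than decoding it all at once, peak memory stays far below the full decoded bitmap size.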

Excel VBA "Out of Memory" debugging techniques

I was debugging a problem mentioned in a few other* questions on SO and noticed a strange behavior during the debugging process.
The behavior:
Experienced 'out of memory' error while pasting complex formulas. Only about half of the 20,000 rows I'm iterating get formulas pasted before the error.
Commented out virtually all code, error goes away.
Uncomment code incrementally in the hopes of discovering the specific section of code that's causing it.
End up uncommenting all code and stop experiencing the bug!
This means the exact same code worked fine in the same Excel instance, and fixing it only required running various lighter versions of the code before going back to the original version. What could possibly cause this?
Assuming the data you were running on was exactly identical every time, it sounds more like your problem was with the environment: the operating system may have run out of memory. In Excel 2007, usable memory for formulas and pivot caches was increased to 2 gigabytes (GB), so that is probably not the issue. However, it is of course also limited by how much memory your operating system had available at the time.
The problem may have occurred because when you first tested it, your available operating system memory was lower (from other processes running... could even have been pushed over the limit by background programs such as Antivirus software running a scan) than when you ran the full macro later. I would try running your macro with the Task Manager open to see if you are getting anywhere close to low on physical memory. Also, (assuming you are on Excel 2007 or later) look at how much memory Excel is using and see if you are getting anywhere close to the 2GB limit. I doubt that this would be the issue, but it's at least worth double checking. Also, like Zairja said, make sure you're setting the calculation to manual at the beginning.
You said that you were using complex formulas... check out this article on Improving Performance in Excel
There is a lot of useful information in the article that will probably help you streamline your macro.
Is this helpful to you?

Western Digital hard drive freezing - bad hard drive

A week ago my computer started freezing for a couple of seconds every 30 seconds to 2 minutes.
So I opened Process Explorer to monitor it and see if I got CPU spikes and, if so, which application was causing them. After some freezes I noticed that none of my programs/services was causing the freezes.
So I checked whether any of my fans had stopped working, but all the fans are working fine.
Eventually I ran a chkdsk scan (along the way I had tons of crashes and startup problems; I couldn't even run the Windows installation disk due to memory diagnostic problems... I really had lots and lots of problems).
Eventually I found the problem: it appears my WD hard drive is faulty. Here are the hard drive results:
http://pastie.org/2949300
Now I'm searching the web for a tool that could fix all its problems, because I really need the drive to work.
Windows 7 Ultimate 64-bit
Intel E6320
4 GB DDR2
ATI HD 5450
Please help me if you can; guide me on what I can do to fix it (my OS is on it).
Buy a new hard drive, install Windows on that, and see what you can read off the old disk. You're getting read and write errors in chkdsk, crashes, etc.; the disk is on the way out.
First of all, try to get a backup of your hard drive / your data. All the actions you're performing right now can lead to data loss.
I don't know whether there is a web tool for fixing these problems; normally, an extended chkdsk (/r or /p) should fix them. Your log shows insufficient space on the partition. Can you move some files to another disk and try to run chkdsk again?
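For completeness, the extended scan referred to above, run from an elevated prompt once you have a backup (/r locates bad sectors and implies /f; /p is a Recovery Console switch on older Windows):

```
chkdsk C: /r
```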