VB.NET Creation of Performance Counter Data on a Remote Server

I'm looking for a way to create a Binary Circular log file with a specific set of counters on a remote system.
I know this can be done using logman and psexec, but I was hoping to find a cleaner example using VB.NET rather than shelling out to an external program.
Here is the command line I would like to replace:
psexec \\%1 logman create counter BlackBox -f bincirc -max 500 -si 00:00:15 -b 01/01/2011 00:00:00 --v -o "C:\PerfLogs\BlackBox" -c "\Cache\*" "\LogicalDisk(*)\*" "\Memory\*" "\Network Interface(*)\*" "\Paging File(*)\*" "\PhysicalDisk(*)\*" "\Process(*)\*" "\Processor(*)\*" "\Redirector\*" "\Server Work Queues(*)\*" "\Server\*" "\System\*" "\Objects\*"
Thanks in advance for your help.
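For what it's worth, logman is essentially a wrapper over the Performance Logs and Alerts (PLA) COM API, which you can call from VB.NET. Here is a minimal, untested sketch, assuming a COM reference to the "Performance Logs & Alerts 1.0 Type Library" (imported here as PlaLibrary); the server name and the trimmed-down counter list are placeholders, and the scheduled start time (the -b switch) would map to dcs.Schedules, which is omitted:

' Sketch only - assumes a COM reference to the PLA 1.0 type library (PlaLibrary).
Imports PlaLibrary

Module CreateBlackBox
    Sub Main()
        Dim dcs As New DataCollectorSet()
        dcs.RootPath = "C:\PerfLogs"      ' folder for the log files (-o)
        dcs.SegmentMaxSize = 500          ' MB, matches -max 500

        ' Create a performance counter collector writing a binary circular log.
        Dim dc As PerformanceCounterDataCollector = _
            CType(dcs.DataCollectors.CreateDataCollector( _
                DataCollectorType.plaPerformanceCounter), PerformanceCounterDataCollector)
        dc.FileName = "BlackBox"
        dc.LogFileFormat = FileFormat.plaBinary  ' -f bincirc...
        dc.LogCircular = True                    ' ...both halves of it
        dc.SampleInterval = 15                   ' seconds, matches -si 00:00:15
        ' Trimmed counter list; add the remaining paths from the logman command.
        dc.PerformanceCounters = New String() {"\Memory\*", "\Processor(*)\*"}
        dcs.DataCollectors.Add(dc)

        ' Commit the set on the remote machine (the psexec \\%1 part), then start it.
        dcs.Commit("BlackBox", "RemoteServer", CommitMode.plaCreateNew)
        dcs.Start(False)
    End Sub
End Module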

Related

Unable to run .sql file in SQL Server

I have a 20 GB .sql dump file that I am trying to run in MySQL Workbench using Run Script; after it executes successfully, I'll use SSMA to migrate the data from MySQL to SQL Server. I have migrated data this way many times, but for a 20 GB file it seems very time-consuming. Please let me know if there is an alternative way to achieve this more quickly. I have followed the following link:
Steps to migrate mysql tables to sql server using SSMA!
From your title, "unable to run .sql file in SSMS", and "I have a .sql dump file 20 gb", are you trying to open a 20 GB .sql file in SSMS? That's never going to work. SSMS is a 32-bit application, so the maximum addressable memory is 2 GB. If you want to run your .sql file, I suggest using sqlcmd.
Open up PowerShell, and then run the command below, replacing the appropriate parts:
sqlcmd -S {Server Name/ServerIP} -U {Your Login} -i {Your full path to your script}
You'll be prompted for your password and then the file will be run. So, as an example, you might run:
sqlcmd -S svSQL2017 -U Larnu -i \\svFileServer\SQLShare\Scripts\BigBatchFile.sql
If you are using integrated security, then don't pass the -U parameter for the command.
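For example, with a trusted connection (-E, or simply omitting -U/-P) the same call becomes:
sqlcmd -S svSQL2017 -E -i \\svFileServer\SQLShare\Scripts\BigBatchFile.sql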
Edit: This answer is not relevant to the OP's question, as they were using "SSMS" as a synonym for SQL Server, which it is not. I have left this here for the moment so the OP can review my comments, and I will likely remove this answer at a later point.

Using BCP utility from R

Is it possible to use the BCP utility in R?
I'm currently using the RODBC package to read from a remote SQL Server, but am experiencing slow data transfer from sqlFetch(), which could be alleviated by using BCP.
Yes it is possible.
First, make sure you can run the BCP utility from anywhere by adding its folder to the Windows PATH environment variable; alternatively, use the full path to bcp.exe in the command.
Then run:
shell("bcp dbName.dbo.tableName in mydata.csv -F 2 -S sqlSrvr -T -f bcp.fmt")
This should be exactly as if you were running it from the cmd prompt.
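Note that "in" loads mydata.csv into the table; since you want to pull data out of SQL Server, the direction keyword would be "out" (or "queryout" for an arbitrary SELECT), for example:
shell("bcp dbName.dbo.tableName out mydata.csv -S sqlSrvr -T -f bcp.fmt")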
The hard part is setting up your data so it matches the format file.
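For reference, the non-XML flavour of a format file is a fixed-layout text file. A minimal sketch for a hypothetical two-column table (the version line should match your bcp release, e.g. 10.0 for SQL Server 2008; the names, lengths, and terminators here are made up):

10.0
2
1   SQLCHAR   0   100   ","      1   Name     SQL_Latin1_General_CP1_CI_AS
2   SQLCHAR   0   12    "\r\n"   2   Amount   ""

Each row maps a field in the data file (order, host type, prefix length, maximum length, terminator) to a table column (order, name, collation).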

T-SQL - Scripting variable not defined

I'm new to T-SQL and I'm trying to back up my databases (using SQL Server 2008).
When I try to run the script via sqlcmd -i inputfile, I get this error message:
'DATE' Scripting variable not defined.
The problem is I have a line like this:
...TO DISK = "FileName_$(ESCAPE_NONE(DATE)).BAK" ...
Including the date in the file name prevents a new backup from replacing my old ones.
If I run it in management studio, it works, but if I run it in command line with the sqlcmd -i command, then it doesn't work.
EDIT:
I looked at the job history and I saw this error message:
"For SQL Server 2005 SP1 or later, you must use the appropriate ESCAPE_xxx
macro to update job steps containing tokens before the job can run"
I don't quite understand what that means. I've already used $(ESCAPE_NONE(DATE)); what's wrong?
Old question, I know, but this is one of the first results, and if anyone else has the same problem the answer isn't particularly easy to find.
Including the -x switch, which disables sqlcmd's variable substitution, fixed the problem for me. sqlcmd treats anything of the form $(name) as a scripting variable, while $(ESCAPE_NONE(DATE)) is a SQL Server Agent token that only Agent knows how to expand, so telling sqlcmd to pass the text through untouched resolves the error:
sqlcmd -x -i inputfile
If you're trying to backup your sql server databases and append the date to them using sqlcmd there's an easy thing you can try.
First, create the stored procedure called sp_BackupDatabases, which you can find here:
http://support.microsoft.com/kb/2019698
You can invoke it from sqlcmd using a command like this:
sqlcmd -U Damieh -P ilovechocolate -S (local) -Q "EXEC sp_BackupDatabases @backupLocation ='C:\MyBackups\', @BackupType='F'"
I'm sure you know this already, but just in case: -U is the user, -P is password, -S is server, and -Q is query. You can either backup all of your databases or some of them, there are parameters for that. You can find the stored proc parameters details on the same link I gave you.
The date will be automatically appended, and you can play with the stored procedure's code if you want it in a different place or format. I use this regularly, with great success, on servers that only have SQL Server Express (meaning I can't schedule backups without using a .bat file and Task Scheduler).
I apologize if this wasn't the answer you were looking for =). Have a nice day!
I know I'm coming along late on this thread, but you can use the following:
SQLCMD -S YourServer -E -d YourDatabase -i YourScript 2> nul
That will send the standard error output to the bit bucket.

Is there a way to directly compress/zips the result from a SQL query?

Of late I have been dumping relatively large tables using SSMS. The usual way is to set Query -> Results To -> File, hit Execute, choose a file, and let the SQL query run. After it finishes, I usually zip the file and then transfer it to my local machine. This has the obvious problem of the host machine running out of space during overnight SQL queries.
I was wondering if there is a way to compress the output from SSMS directly without having to wait until it dumps the results from the entire query. Any suggestions? The host machine is pretty restricted in what it allows me to run on it so a suggestion that requires minimal third-party software would be great.
Run the queries from sqlcmd instead and pipe the output into a command-line zip utility (you'll need to install one; see What's a good tar utility for Windows?). Alternatively, you can use PowerShell, which can zip out of the box, including piped input; see Compress Files with Windows PowerShell then package a Windows Vista Sidebar Gadget. This requires no additional tools, as PowerShell is already on your host server (although, on second read, I think the PowerShell solutions in that link still require the uncompressed file to exist first and cannot compress on the fly).
Sample query using sqlcmd and 7zip:
sqlcmd -S <SERVER> -s <COLUMNSEP> -Q "SELECT ..." | .\7za.exe a -si <FILENAME>
Remember to use -Q (run query and exit) and not -q (run query); with -q, sqlcmd stays in interactive mode, the pipe never closes, and this won't work.
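As a concrete variant, assuming the standalone 7za.exe sits in the current folder, you can gzip the stream as it arrives (the server, separator, query, and file name are placeholders):
sqlcmd -S <SERVER> -s "," -Q "SELECT * FROM dbo.BigTable" | .\7za.exe a -tgzip -si BigTable.csv.gz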

SQL Server 2008 Job based on changing Stored Procedures

I have looked through the SQL Server questions and answers and I didn't see an answer to this one; if it is out there and I've missed it, please let me know.
Here's the situation:
I write stored procedures and views that are then run as reports (using Crystal) - this is not the problem. Before I am able to release the reports into Production, I need to have the end users run the reports and check them for errors, etc. In a perfect world, I would have a frozen test environment, but I don't live in a perfect world. Every night, everything I place into my test environment is wiped out, and every morning anything that is in end-user testing needs to be re-added. This means that when I come in, the first thing I do is run all of the stored procedures, along with a script that unhides the reports in the program we use.
What I'd like to be able to do is to write a package that would find all of the stored procedures in a folder and execute them to add them to the database and, then, run the script that unhides the reports.
I know how to set up an SSIS package to run a stored procedure, but I don't know how to set one up that would run an ever changing list of stored procedures. Is this even possible? And, if it is, how do I go about starting this up?
I should note that while I have more than 10 years of query-writing experience, I haven't used VB since VB 6.0 and I'm very new to the SSIS and SSRS world.
Thanks in advance!
Good old NT shell will do the trick. Run this statement in the folder containing the files.
for %A in (*.sql) DO sqlcmd -i %A -S <myServer> -d <myDb> -E
if you want to include it in a batch file it could look like
@echo off
for %%A in (*.sql) DO sqlcmd -i %%A -S <myServer> -d <myDb> -E
sqlcmd -i script_to_update_config.sql -S <myServer> -d <myDb> -E
This actually sounds like it may be more of a deployment issue than a SQL one. Take a look at Jenkins CI. I believe it's mostly used for code build and deployment, but it can also be used for any automated task.
If you had one SQL file that listed all the changed procs and their associated files, you could use that single script to run all the others: http://www.devx.com/tips/Tip/15132. For that matter, you could just use a scheduled task to run it every morning.
Adding one more step, you could build that file based on the contents of a folder (using a little PowerShell script or the like), as sketched below.
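As a sketch of that last idea, assuming the per-proc .sql files sit in one folder and you drive them through sqlcmd's :r include directive (RunAll.sql is just a placeholder name):

# Build a caller script that :r-includes every .sql file in the folder, then run it.
Get-ChildItem -Filter *.sql |
    Where-Object { $_.Name -ne 'RunAll.sql' } |
    ForEach-Object { ":r `"$($_.FullName)`"" } |
    Set-Content RunAll.sql
sqlcmd -S <myServer> -d <myDb> -E -i RunAll.sql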
I'm not sure an SSIS package is the right tool for this particular job.