I'm trying to import my CSV file into my SQL database like this, but I'm not sure why it's saying
Exception calling "ExecuteWithResults" with "1" argument(s): "Value cannot be null.
Parameter name: sqlCommands"
even though I don't have null values in my CSV file, and I also made sure my table columns accept null values.
$s = New-Object Microsoft.SqlServer.Management.Smo.Server "server name"
$db = $s.Databases.Item("LitHold")
$csvfile = import-csv -delimiter ";" -path "C:\scripts\LitHold-OneDrive\output\Return\2022-01-12-Return.csv"
$csvfile |foreach-object{
$query = "insert into DailyReport VALUES ('$($_.MIN)','$($_.MID)','$($_.UPN)','$($_.Change)','$($_.Type)','$($_.HoldValue)','$($_.OneDrive)','$($_.Mailbox)','$($_.Created)','$($_.Modified)','$($_.MultMID)','$($_.Account)','$($_.ExistOD)')"
}
$result = $db.ExecuteWithResults($query)
# Show output
$result.Tables[0]
My CSV file
(The first line is my column names; those columns already exist in my table.)
"MIN","MID","UPN","Change","Type","Hold Value","OneDrive","Mailbox","Created","Modified","Mult MID","Account","Exist OD"
"338780228","lzlcdg","lzlcdg#NAMQA.COM","Hold Created","OneDrive and Mailbox","Y","https://devf-my.sharepoint.com/personal/lzlcdg_namqa_corpqa_geuc_corp_com","lzlcdg#NAMQA.CORPQA.GEUC.CORP.COM","1/11/2022 11:38:57 AM","1/11/2022 11:38:57 AM","N","",""
"419150027","lzs8rl","lzs8rl#.corpq.com","Hold Created","OneDrive and Mailbox","Y","https://my.sharepoint.com/personal/lzs8rl_namqa_corpqa_gcom","lzs8rl#namqa.corpqa.geuc.corp.com","1/11/2022 11:39:05 AM","1/11/2022 11:39:05 AM","N","",""
Don't remove the column headers, but double-check how they are written: with spaces.
Your code ignores those spaces here:
$($_.HoldValue) --> $($_.'Hold Value')
$($_.MultMID) --> $($_.'Mult MID')
$($_.ExistOD) --> $($_.'Exist OD')
Either keep the code and rewrite the headers (take out the spaces) or make sure you use the property names according to the headers.
If you remove the column headers, the first line in the CSV file will be used as column headers unless you supply new ones with the -Header parameter. Removing the headers will also cause problems if the same field value is encountered more than once, because column headers must be unique.
Then there is this line:
$result = $db.ExecuteWithResults($csvfile)
which should be
$result = $db.ExecuteWithResults($query)
AND there is no point in looping over the records of the CSV file only to overwrite your query string on every iteration inside that loop, so that only the last record will remain...
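Putting those fixes together, a minimal corrected sketch could look like this (a sketch only: it assumes the DailyReport columns line up one-to-one with the CSV fields, and it builds and executes the query inside the loop so every row gets inserted):
$s = New-Object Microsoft.SqlServer.Management.Smo.Server "server name"
$db = $s.Databases.Item("LitHold")
# keep -Delimiter in sync with the real file; the sample in the question looks comma-separated
$csvfile = Import-Csv -Delimiter ";" -Path "C:\scripts\LitHold-OneDrive\output\Return\2022-01-12-Return.csv"
$csvfile | ForEach-Object {
    # property names that contain spaces must be quoted
    $query = "INSERT INTO DailyReport VALUES ('$($_.MIN)','$($_.MID)','$($_.UPN)','$($_.Change)','$($_.Type)','$($_.'Hold Value')','$($_.OneDrive)','$($_.Mailbox)','$($_.Created)','$($_.Modified)','$($_.'Mult MID')','$($_.Account)','$($_.'Exist OD')')"
    # an INSERT returns no result set, so ExecuteNonQuery is the better fit here
    $db.ExecuteNonQuery($query)
}
Also keep in mind that string-building an INSERT like this breaks on values containing single quotes; a parameterized insert is the more robust route.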
Hi Glorious People of the Interwebz!
I come to you with a humble question (please go easy on me, I am fairly OK in PowerShell, but my SQL skills are minimal... :( )
So I have been tasked with writing a PowerShell script to import data (from a number of CSV files to a database), and I have made good progress, based on this (I heavily modified my version). All works dashingly, except one part: when I try to insert the values (I created a sort of "mapping file" to map the CSV headers to the data), I can't seem to use the created string in the VALUES part. So here is what I have:
This is my current PowerShell code (ignore the comments)
This is a sample data CSV
This is my mapping file
What I would want is to replace the
VALUES(
'$($CSVLine.Invoice_Status_Text)',
'$($CSVLine.Invoice_Status)',
'$($CSVLine.Dispute_Required_Text)',
'$($CSVLine.Dispute_Required)',
'$($CSVLine.Dispute_Resolved_Text)',
'$($CSVLine.Dispute_Resolved)',
'$($CSVLine.Sub_Account_Number)',
'$($CSVLine.QTY)',
'$($CSVLine.Date_of_Service)',
'$($CSVLine.Service)',
'$($CSVLine.Amount_of_Service)',
'$($CSVLine.Total)',
'$($CSVLine.Location)',
'$($CSVLine.Dispute_Reason_Text)',
'$($CSVLine.Dispute_Reason)',
'$($CSVLine.Numeric_counter)'
);"
part, for example, with a string generated this way:
But when I replace the long - and honestly, boring to type - values with the $valueString, I get this type of error:
Incorrect syntax was encountered while parsing '$($'.
Not sure if it matters, but my PS version is 7.1.
Any good people who can give a good suggestion on how to build the values from my text file...?
Ta,
F.
As commented, wrapping variables inside single-quotes takes the variable as written literally, so you do not get the value contained (7957), but a string like $($CSVLine.Numeric_counter) instead.
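A quick illustration of that difference (the object here is just a hand-made stand-in for one row of your CSV):
$CSVLine = [pscustomobject]@{ Numeric_counter = 7957 }   # stand-in for one CSV row
'VALUES($($CSVLine.Numeric_counter))'   # single quotes: prints the literal text, no expansion
"VALUES($($CSVLine.Numeric_counter))"   # double quotes: expands to VALUES(7957)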
I don't do SQL a lot, but I think I would change the part where you construct the values to insert like this:
# demo, read the csv file in your example
$csv = Import-Csv D:\Test\test.csv -Delimiter ';'
# demo, these are the headers (or better yet, the Property Names to use from the objects in the CSV) as an ARRAY
# (you use `$headers = Get-Content -Path 'C:\Temp\SQL\ImportingCSVsIntoSQLv1\config\headers.txt'`)
$headers = 'Invoice_Status_Text','Invoice_Status','Dispute_Required_Text','Dispute_Required',
'Dispute_Resolved_Text','Dispute_Resolved','Sub_Account_Number','QTY','Date_of_Service',
'Service','Amount_of_Service','Total','Location','Dispute_Reason_Text','Dispute_Reason','Numeric_counter'
# capture formatted blocks of values for each row in the CSV
$AllValueStrings = $csv | ForEach-Object {
# get a list of values using the property names you have in $headers
$values = foreach ($propertyName in $headers) {
$value = $_.$propertyName
# output the VALUE to be captured in $values
# for SQL, single-quote the string type values. Numeric values without quotes
if ($value -match '^[\d\.]+$') { $value }
else { "'{0}'" -f $value }
}
# output the values for this row in the CSV
$values -join ",`r`n"
}
# $AllValueStrings will now have as many formatted values to use
# in the SQL as there are records (rows) in the csv
$AllValueStrings
Using your examples, $AllValueStrings would yield
'Ready To Pay',
1,
'No',
2,
'',
'',
'',
'',
'',
'',
'',
'',
'',
'',
'',
7957
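For completeness, one way to feed those blocks into the actual INSERT statements could be something like this (dbo.Invoices is a made-up table name; take the real table and column names from your mapping file):
$columnList = $headers -join ', '
foreach ($valueString in $AllValueStrings) {
    # one INSERT per CSV row
    $insertQuery = "INSERT INTO dbo.Invoices ($columnList) VALUES ($valueString);"
    # print it to check the generated statement, then execute it with whatever you already use to talk to SQL
    $insertQuery
}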
Can anyone please help me write a script in AHK based on the below requirement?
Requirement:
I have a CSV/TXT file in my Windows environment which contains 20,000+ records in the below format.
So, when I run the script, it should prompt with an InputBox to enter an instance name.
Example: if I enter Instance4, it should display the result ServerName4 in a MsgBox.
Sample Format:
ServerName1,ServerIP,Instance1,Type
ServerName2,ServerIP,Instance2,Type
ServerName3,ServerIP,Instance3,Type
ServerName4,ServerIP,Instance4,Type
ServerName5,ServerIP,Instance5,Type
.
.
.
Also, as the CSV/TXT file contains a large number of records, please consider the best way to avoid delays in fetching the results.
Please post your code, or at least show what you've already done.
You can use a Parsing Loop with CSV as the delimiter, and make a variable for each 'Instance' whose value is that of the current row's 'ServerName'.
The steps are to first FileRead the data from the file, then use Loop, Parse like so:
Loop, Parse, data, `n, `r                ; parses row by row
{
    row := A_LoopField                   ; the current row
    Loop, Parse, row, CSV                ; then column by column in each row
        col%A_Index% := A_LoopField      ; col1 = ServerName, col3 = Instance
    %col3% := col1                       ; make a variable named with the value of column 3, holding the value of column 1
}
After that, you can make a Goto loop that repeatedly shows an InputBox, followed by a command that prints out the needed variable using the MsgBox command, like so:
MsgBox % %input%
I'm trying to automate writing CSV files to an RSQLite DB.
I am doing so by indexing csvFiles, which is a list of data.frame variables stored in the environment.
I can't seem to figure out why my dbWriteTable() code works perfectly fine when I enter it manually but not when I try to index the name and value fields.
### CREATE DB ###
mydb <- dbConnect(RSQLite::SQLite(),"")
# FOR LOOP TO BATCH IMPORT DATA INTO DATABASE
for (i in 1:length(csvFiles)) {
dbWriteTable(mydb,name = csvFiles[i], value = csvFiles[i], overwrite=T)
i=i+1
}
# EXAMPLE CODE THAT SUCCESSFULLY MANUAL IMPORTS INTO mydb
dbWriteTable(mydb,"DEPARTMENT",DEPARTMENT)
When I run the for loop above, I'm given this error:
"Error in file(file, "rt") : cannot open the connection
In addition: Warning message:
In file(file, "rt") :
cannot open file 'DEPARTMENT': No such file or directory
# note that 'DEPARTMENT' is the value of csvFiles[1]
Here's the dput output of csvFiles:
c("DEPARTMENT", "EMPLOYEE_PHONE", "PRODUCT", "EMPLOYEE", "SALES_ORDER_LINE",
"SALES_ORDER", "CUSTOMER", "INVOICES", "STOCK_TOTAL")
I've researched this error and it seems to be related to my working directory; however, I don't really understand what to change, as I'm not even trying to manipulate files from my computer, simply data.frames already in my environment.
Please help!
Simply use get() for the value argument as you are passing a string value when a dataframe object is expected. Notice your manual version does not have DEPARTMENT quoted for value.
# FOR LOOP TO BATCH IMPORT DATA INTO DATABASE
for (i in seq_along(csvFiles)) {
dbWriteTable(mydb,name = csvFiles[i], value = get(csvFiles[i]), overwrite=T)
}
Alternatively, consider building a list of named data frames with mget and looping element-wise over the list's names and data frame elements with Map:
dfs <- mget(csvFiles)
output <- Map(function(n, d) dbWriteTable(mydb, name = n, value = d, overwrite=T), names(dfs), dfs)
In Pig, when we load a CSV file using a LOAD statement without mentioning a schema and with the default PigStorage delimiter ('\t'), what happens? Will the LOAD work fine and can we dump the data? Or will it throw an error since the file has ',' and the PigStorage delimiter is '\t'? Please advise.
When you load a csv file without defining a schema using PigStorage('\t'), since there are no tabs in each line of the input file, the whole line will be treated as one tuple. You will not be able to access the individual words in the line.
Example:
Input file:
john,smith,nyu,NY
jim,young,osu,OH
robert,cernera,mu,NJ
a = LOAD 'input' USING PigStorage('\t');
dump a;
OUTPUT:
(john,smith,nyu,NY)
(jim,young,osu,OH)
(robert,cernera,mu,NJ)
b = foreach a generate $0, $1, $2;
dump b;
(john,smith,nyu,NY,,)
(jim,young,osu,OH,,)
(robert,cernera,mu,NJ,,)
Ideally, b should have been:
(john,smith,nyu)
(jim,young,osu)
(robert,cernera,mu)
if the delimiter were a comma. But since the delimiter was a tab, and a tab does not exist in the input records, the whole line was treated as one field. Pig does not complain if a field is null; it just outputs nothing when there is a null. Hence you see only the commas when you dump b.
Hope that was useful.
I have a PowerShell script with which I am trying to back up a constantly changing number of SQL databases. Fortunately, all of these databases are listed in a registry key, which I am leveraging in a foreach loop. The issue I am having is that after grabbing the registry value I want, when I try to pass it into my function to back up the databases, there seems to be extra information in the variable that I can't get rid of. If I output the contents of the variable to the screen by just calling the variable ($variable), it shows just fine. But if I Write-Host the variable to the screen, the extra "content" that shows up when calling the function also shows up.
Here is the part of the script that generates the contents of the variable.
foreach ($childitem in get-childitem "HKLM:\SOFTWARE\Wow6432Node\Lanovation\Prism Deploy\Server Channels")
{$DBName = get-itemproperty Registry::$childitem | select "Channel Database Name"
write-host $DBname}
Here is what Write-Host displays:
#{Channel Database Name=Prism_Deploy_Sample_268CBD61_AC9E_4853_83DE_E161C72458DE}
but what I need is only this part:
Prism_Deploy_Sample_268CBD61_AC9E_4853_83DE_E161C72458DE
I have tried looking online at how to do this, and what I've found mentions things similar to $variable.split and then specifying my delimiters. But when I try this I get an error saying "Method invocation failed because [System.Management.Automation.PSCustomObject] doesn't contain a method named 'split'."
I'm at a loss as to where to go from here.
Select-Object will return an object that has the named properties that you "select". To get just the value of that property, access it by name:
write-host $DBname."Channel Database Name"
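Put into the context of your loop, that could look like the sketch below; with direct property access the Select-Object step isn't needed at all:
foreach ($childitem in Get-ChildItem "HKLM:\SOFTWARE\Wow6432Node\Lanovation\Prism Deploy\Server Channels") {
    # read the key and take just the string value of the property
    $DBName = (Get-ItemProperty Registry::$childitem)."Channel Database Name"
    Write-Host $DBName   # plain string, ready to hand to your backup function
}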
Sounds like it's returning a hash table row object.
Try
write-host $DBName.value
or, failing that, do a
$DBName | Get-member
When in doubt, get-member gives you a nice idea of what you are dealing with.
You should be able to write
foreach ($childitem in get-childitem "HKLM:\SOFTWARE\Wow6432Node\Lanovation\Prism Deploy\Server Channels")
{$DBName = get-itemproperty Registry::$childitem | select -ExpandProperty "Channel Database Name"
write-host $DBname}
to get what you are looking for