I have a database table with 6 columns and several thousand rows, and I'd like to export the contents into separate, named text files.
id | title | text            | text_2
1  | blah  | lorem ipsem...  | indigo violet...
2  | gunf  | ipsem lorem...  | up down left...
3  | faff  | sir I have a... | amarillo albuquerque...
For each row I'd like to create a single text file, with the three text columns written into it in turn:
filename = id - title.txt; content = title
filename = id - title.txt; content = text
filename = id - title.txt; content = text_2
I've looked around and can't work out how to do it. I used to use a macro in Excel to convert cells to txt files, but this text is too big for Excel's cell character limit.
I'm using SQLite but am not wedded to it (though would rather not have to buy a program if I can avoid it).
Any advice on what to do? While I'm not too techy, I can follow some basic code.
You can use SQLiteStudio 3.x.x and its custom SQL functions.
Open the "Custom SQL Functions editor" (the one with the "fx" icon) and add a new function; let's call it saveToFile. Select Tcl as the implementation language (in the top-right corner) and leave "Type" as Scalar. If you want, you can define the input arguments fileName and contents (this isn't mandatory; it just lets the code assistant help you later on when invoking this function). It's also fine to keep the "Register in all databases" option.
Now the most important part - enter the following implementation code:
if {[catch {
    lassign $argv fileName contents        ;# the two arguments passed from the SQL call
    set fd [open "C:/tmp/a/$fileName" a+]  ;# open for appending, creating the file if needed
    puts $fd $contents
    close $fd
} res]} {
    return "error: $res"
} else {
    return "ok"
}
The code contains C:/tmp/a, which is the path to the directory where your files will be created. Change it to whatever you want; the directory must exist.
Commit your function (the commit button is at the top of the functions editor).
Now open an SQL editor window (the one with a blank paper and a pencil) and type a query like this:
SELECT saveToFile(id || ' - ' || title || '.txt', title || x'0a' || text || x'0a' || text_2) AS result FROM table_name;
This query will create one file per row, putting the title, text and text_2 columns into the file as three separate lines.
For each row the query returns the status ok when there was no problem, or error: ... with error details when there was a problem, for example while creating the file.
Can anyone please help me write a script in AHK based on the requirement below?
Requirement:
I have a CSV/TXT file in my Windows environment which contains 20,000+ records in the format below.
When I run the script, it should show an InputBox prompting for an instance name.
Example: if I enter Instance4, it should display the result ServerName4 in a MsgBox.
Sample Format:
ServerName1,ServerIP,Instance1,Type
ServerName2,ServerIP,Instance2,Type
ServerName3,ServerIP,Instance3,Type
ServerName4,ServerIP,Instance4,Type
ServerName5,ServerIP,Instance5,Type
.
.
.
Also, as the CSV/TXT file contains a large number of records, please consider the best way to avoid delays in fetching the results.
Please post your code, or at least show what you've already done.
You can use a parsing loop with CSV as the delimiter and make a variable for each 'Instance' whose value is that of the current row's 'ServerName'.
The steps are to first FileRead the data from the file, then Loop, Parse like so:
FileRead, data, servers.csv          ; read the whole file (use your own file name here)
Loop, Parse, data, `n, `r            ; outer loop: one row per iteration
{
    Loop, Parse, A_LoopField, CSV    ; inner loop: one column per iteration
    {
        if (A_Index = 1)
            serverName := A_LoopField     ; column 1: ServerName
        else if (A_Index = 3)
            %A_LoopField% := serverName   ; column 3: create a variable named Instance* holding ServerName*
    }
}
After that, you can make a loop that repeatedly shows an InputBox, followed by a command that prints out the needed variable using the MsgBox command, like so:
MsgBox % %input%
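A minimal sketch of that prompt loop, assuming the variables built above (the input variable name and the Lookup window title are just illustrative):
Loop
{
    InputBox, input, Lookup, Enter an instance name:
    if ErrorLevel                 ; user pressed Cancel
        break
    MsgBox % %input%              ; double-deref: shows the ServerName stored under that instance name
}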
I have tried the readLines and the read.csv functions but they don't work.
Here are the contents of the my_script.sql file:
SELECT EmployeeID, FirstName, LastName, HireDate, City FROM Employees
WHERE HireDate >= '1-july-1993'
and it is saved on my Desktop.
Now I want to run this query from my R script. Here is what I have:
conn = connectDb()
fileName <- "C:\\Users\\me\\Desktop\\my_script.sql"
query <- readChar(fileName, file.info(fileName)$size)
query <- gsub("\r", " ", query)
query <- gsub("\n", " ", query)
query <- gsub("", " ", query)
recordSet <- dbSendQuery(conn, query)
rate <- fetch(recordSet, n = -1)
print(rate)
disconnectDb(conn)
And I am not getting anything back in this case. What can I try?
I've had trouble with reading sql files myself, and have found that the syntax often gets broken if there are any single-line comments in the sql. Since in R you store the sql statement as a single-line string, any double dash in the sql will essentially comment out all code after it.
This is a function that I typically use whenever I am reading in a .sql file to be used in R.
getSQL <- function(filepath){
  con <- file(filepath, "r")
  sql.string <- ""

  while (TRUE){
    line <- readLines(con, n = 1)
    if (length(line) == 0){                       # end of file
      break
    }

    line <- gsub("\\t", " ", line)                # replace tabs with spaces

    if (grepl("--", line)){                       # convert -- comments to /* */ so they
      line <- paste(sub("--", "/*", line), "*/")  # survive being joined onto one line
    }

    sql.string <- paste(sql.string, line)
  }

  close(con)
  return(sql.string)
}
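A usage sketch with the objects from the question (conn, dbSendQuery and fetch as defined there):
query <- getSQL("C:\\Users\\me\\Desktop\\my_script.sql")
recordSet <- dbSendQuery(conn, query)
rate <- fetch(recordSet, n = -1)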
I've found for queries with multiple lines, the read_file() function from the readr package works well. The only thing you have to be mindful of is to avoid single quotes (double quotes are fine). You can even add comments this way.
Example query, saved as query.sql
SELECT
COUNT(1) as "my_count"
-- comment goes here
FROM -- tabs work too
my_table
I can then store the results in a data frame with
df <- dbGetQuery(con, statement = read_file('query.sql'))
You can use the read_file() function from the readr package.
fileName = read_file("C:/Users/me/Desktop/my_script.sql")
You will get a string variable fileName with the desired text.
Note: use / instead of \\ in the path.
The answer by Matt Jewett is quite useful, but I wanted to add that I sometimes encounter the following warning when trying to read .sql files generated by sql server using that answer:
Warning message: In readLines(con, n = 1) : line 1 appears to contain
an embedded nul
The first line returned by readLines is often "ÿþ" in these cases (i.e. the UTF-16 byte order mark) and subsequent lines are not read properly. I solved this by opening the sql file in Microsoft SQL Server Management Studio and selecting
File -> Save As ...
then on the small downarrow next to the save button selecting
Save with Encoding ...
and choosing
Unicode (UTF-8 without signature) - Codepage 65001
from the Encoding dropdown menu.
If you do not have Microsoft SQL Server Management Studio and are using a Windows machine, you could also try opening the file with the default text editor and then selecting
File -> Save As ...
Encoding: UTF-8
to save with a .txt file extension.
Interestingly, changing the file within Microsoft SQL Server Management Studio removes the BOM (byte order mark) altogether, whereas changing it within the text editor converts it to the UTF-8 BOM; either way, the query is then read properly using the referenced answer.
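As an alternative to re-saving the file, here is a hedged base-R sketch that reads the UTF-16 file directly, assuming the "ÿþ" first line really does indicate a UTF-16LE encoding:
con <- file("C:/Users/me/Desktop/my_script.sql", encoding = "UTF-16LE")
open(con, "rt")                                 # text mode; encoding declared above
query <- paste(readLines(con), collapse = " ")  # decoded line by line
close(con)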
The combination of readr and textclean works well without having to create any new functions. read_file() reads the file into a character vector and replace_white() ensures all escape-sequence characters are removed from your .sql file. Note: this does cause problems if you have comments in your SQL string!
library(readr)
library(textclean)
SQL <- replace_white(read_file("file_path"))
I have a table with about 900 records. A sample record looks like this (field name, then value):
ID               1
FNN              A99TEST9999
DSLAM_ID         QXXXXENNNN
SHORT_CODE       ABCDE
PORT_TYPE        DSL48P
PANEL            1
SLOT             11
CHANNEL          38
CONNECTION_TYPE  ABC
SERVICE_TYPE     ADSL
PVCID            RANDOMIDXXYY
CHANNEL_TYPE     N
PROD_CODES       ADESP=NNNNNNN_ABCDEFG_L2PPP
I'd like to build a text file where, for each record, a new line is built with specific fields inserted as variables.
An example line:
FNN="[FNN]" : ACTION="" : SERVICE_TYPE="[CONNECTION_TYPE]" : NE_ID="[DSLAM_ID]", NE_DEFN="[SERVICE_TYPE]", PORT="[PANEL] / [SLOT] / [CHANNEL]"
I've seen people write scripts to create router configurations before, and essentially that's what I want to do to build a mass configuration file for an application.
You'll need to get the recordset object and then do something like this:
Open "yourfilename.txt" for Output as #1
While not (recordset.eof)
Print #1, "FNN=" & recordset.fields("FNN").value (add the rest of your string here...)
recordset.movenext
Wend
Close #1
Technically, instead of "#1" you should grab the file number by using the FreeFile() function.
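A minimal sketch of that variant (the fileNum name is introduced here for illustration):
Dim fileNum As Integer
fileNum = FreeFile()                          ' ask VB for the next unused file number
Open "yourfilename.txt" For Output As #fileNum
' ... Print #fileNum, ... as above ...
Close #fileNum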
Code:
set heading off
set arraysize 1
set newpage 0
set pages 0
set feedback off
set echo off
set verify off
spool 'c:\farmerList.csv'
/
select FIRSTNAME','LASTNAME','TRN','CELL','PARISH
spool off
The file is being saved to the directory, but it is saving the text "select FIRSTNAME','LASTNAME','TRN','CELL','PARISH" and not the results of the query in CSV format. What am I doing wrong?
Your select is incomplete as you don't have a from clause, though perhaps that was lost in the copy-and-paste. As it is there is nothing to run, since the partial statement is never executed (there's no terminating ; or / on the next line). If you did have a from farmers; clause then it would show the command plus, probably, an ORA-00923 error.
You can't just put a quoted comma between the fields; you need to concatenate the fields with that character using the || concatenation symbol:
spool 'c:\farmerList.csv'
select FIRSTNAME
||','|| LASTNAME
||','|| TRN
||','|| CELL
||','|| PARISH
from farmers;
gives a file containing
Joe,Grundy,X,Y,Ambridge
The fields don't have to be on separate lines, I just find that easier to read and to keep track of the commas.
You don't need the / after the spool command - that will re-execute the last statement before the spool, if there is one - and you don't need the quotes around the spool file name unless it contains spaces, but they don't hurt.
There's also a set colsep command which you can use to make the column separator a comma, but then you have to worry about padding, so I find it easier to concatenate the columns together as you're (almost) doing.
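For reference, a minimal SQL*Plus sketch of the colsep route (the padding caveat above still applies; trimspool only removes trailing blanks):
set colsep ','
set trimspool on
set pagesize 0
set feedback off
set heading off
spool c:\farmerList.csv
select FIRSTNAME, LASTNAME, TRN, CELL, PARISH from farmers;
spool off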
Except that's for SQL*Plus, as I didn't notice the SQL Developer reference in the title. Spool is a bit odd in Developer as it seems to trap and echo things you probably don't want, and not all of the set commands work (which ones depends on the version).
The safer and preferred way, I think, is to run a normal query without concatenated commas:
select FIRSTNAME, LASTNAME, TRN, CELL, PARISH
from farmers;
and with 'run' rather than 'run script', so that the results appear in the grid view in the query result window. Right-click on the grid and choose 'export'. You can then save as a CSV, or even as an XLS, and can choose to not have a header row if you prefer.
This is a working solution; please go through it:
import java.sql.*;
import java.io.*;
import au.com.bytecode.opencsv.CSVWriter;

public class TableExport {
    public static void main(String[] args) {
        try {
            DriverManager.registerDriver(new oracle.jdbc.driver.OracleDriver());
            // note the @ before the host in the JDBC URL
            Connection conn = DriverManager.getConnection(
                    "jdbc:oracle:thin:@localhost:1521:XE", "name", "password");
            conn.setAutoCommit(false);
            Statement statement = conn.createStatement();
            ResultSet resultData = statement.executeQuery("select * from your_table");
            // '|' is the field separator; pass ',' instead for a standard CSV
            CSVWriter writer = new CSVWriter(new FileWriter(new File("D:/Uploads/Output5.csv")), '|');
            writer.writeAll(resultData, true); // true = include column headers
            writer.close();
        } catch (Exception e) {
            System.out.println("Error" + e);
        }
    }
}
If anyone likes this, please note you will need oracle-jdbc.jar and opencsv1.7.jar on the classpath to properly execute this code.
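A hypothetical compile-and-run invocation on Windows (jar names as above, assumed to sit next to the source file):
javac -cp .;oracle-jdbc.jar;opencsv1.7.jar TableExport.java
java -cp .;oracle-jdbc.jar;opencsv1.7.jar TableExport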
Using Vim, I'm trying to pipe text selected in visual mode to a UNIX command and have the output appended to the end of the current file. For example, say we have a SQL command such as:
SELECT * FROM mytable;
I want to do something like the following:
<ESC>
V " select text
:'<,'>!mysql -uuser -ppass mydb
But instead of having the output overwrite the currently selected text, I would like it appended to the end of the file. You probably see where this is going: I'm working on using Vim as a simple SQL editor, so that I don't have to leave Vim to edit, tweak and test SQL code.
How about copying the selected text to the end of the file, selecting the copy and running the command over that? If you don't want to repeat the same commands over and over again, you can record the sequence with q or add a new command. I have tried the latter as follows:
:com -range C <line1>,<line2>yank | $ | put | .,$ !rev
With it you can select some lines and then type :C. This will first yank the selection, then go to the end of the file, paste the yanked text and run the command (rev in this case) over the new text.
If you prefer more programmatic approach, you can have
:call append(line("$"), system("command", GetSelectedText()))
where GetSelectedText is the reusable function:
func! GetSelectedText()
    " reselect the last visual area and yank it into register x
    normal gv"xy
    let result = getreg("x")
    " restore the visual selection
    normal gv
    return result
endfunc
Try
:r !YourCommand
For example:
:r ! echo foo
adds foo to your buffer.