I am trying to create an Excel file in iOS. I managed to do this simply by creating a string and writing it to a file with the extension .csv. This works, but the problem is that all the data ends up in a single cell. Can anyone help me out with some code?
You can just insert a comma in the string between the separate columns and a newline after every row, then write it out as a CSV and it will work.
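Just to make the shape of the string concrete, here is a minimal sketch (in Python, with made-up rows; the same joining logic applies to the string you build on iOS):

# Made-up rows, just to illustrate the layout
rows = [
    ["Name", "Age"],
    ["Alice", "30"],
    ["Bob", "25"],
]
# comma between columns, newline between rows
csv_text = "\n".join(",".join(row) for row in rows)
# csv_text now contains:
# Name,Age
# Alice,30
# Bob,25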
Good luck
I am new to Pentaho. I am trying to build a transformation that can convert a bunch of .xlsx files to .csv (utf-8).
I tried Get File Names and Text File Output, but that saves a single csv file whose content is just the file properties.
I also tried Microsoft Excel Input and Microsoft Excel Output and that did not work either.
Any help will be appreciated. TIA!
I have prepared a solution for you and made it fully dynamic. Because of that, the solution is a combination of six pieces (transformations and jobs). You only need to define the following two things:
Source folder location
Destination folder location
Everything else works dynamically.
Also, I learned a lot while building this solution.
Would you like to generate a separate CSV for each Excel file?
It is better to do it like this:
Using the Get File Names component, read the list of Excel files from the folder.
Then call Execute Transformation, and pass the name of the file.
The transformation then runs once for each file, and a separate CSV is generated for each Excel file.
I want to insert a large amount of data from a TXT file into a SQLite database using VBA.
Basically I have already generated my TXT files, but I couldn't find any code or information about how to insert them into SQLite automatically. Oh yes, and before I got to this point I tried to save the same data directly into SQLite, but it takes too much time to record all the data, which is spread across more than one database file according to a rule that must be followed.
I don't know the exact number, but it's somewhere around 600K records and it takes a whole day. And since I realised the recording process was much faster when I used TXT files, I now want to load those TXT files into my database automatically.
As I said, I couldn't find anything similar, but I guess it's just a matter of connecting SQLite and VBA and then sending a command to insert a TXT file "X" into a specified database "Y".
But of course, if someone knows how I could record my data faster, that would also be a solution, since the reason I want to go from TXT to SQLite is that going from Excel (VBA) to SQLite directly takes so long. So this is what I have: VBA -> SQLite (slow). What I want: VBA -> TXT -> SQLite. But so far I only have: VBA -> TXT.
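For reference, the pattern being described (load a delimited text file into SQLite in one bulk operation) looks roughly like the sketch below. It is shown in Python with made-up file, table, and column names; the same idea works from VBA through a SQLite ODBC/ADO driver, or by shelling out to the sqlite3 command-line tool and its .import command. The speed comes from doing all the inserts inside a single transaction.

# Minimal sketch (not VBA): bulk-load a tab-delimited text file into SQLite.
# File name, table name, and columns are made up for illustration.
import csv
import sqlite3

conn = sqlite3.connect("mydata.db")
conn.execute("CREATE TABLE IF NOT EXISTS records (col1 TEXT, col2 TEXT, col3 TEXT)")

with open("export.txt", newline="", encoding="utf-8") as f:
    reader = csv.reader(f, delimiter="\t")
    # One executemany inside a single transaction is what makes this fast;
    # committing after every single row is what makes it slow.
    conn.executemany("INSERT INTO records VALUES (?, ?, ?)", reader)

conn.commit()
conn.close()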
First, a grumble: MS builds SQL Server Studio AND Excel, but can't make one save in the standard format of the other?
OK, I'm a data analyst, but not allowed to change/mod either the data or structures directly. So full READ, but no WRITE.
I'm trying to do a dump so I can do some of this analysis offline, as I have no remote access either.
So one VARCHAR2 column in this table is for comments on the purchase of the asset being described/tracked. Of course, there are commas. The only export types built into SQL Server Studio are .csv and .txt, and .csv just turns into a mess when 'comma' is included as a delimiter.
So after an hour or so of screwing around with this (including reading a thread on methods for excluding the one column from a SELECT while still exporting the other 221 columns in the table without having to write them all out manually; fun reading, impressive, but it means I'd have to figure out which of them actually works, and then still export the one column separately and insert it into the Excel file separately), I am throwing this problem on the pile at Stack Overflow.
Someone else must have worked around this frustration of exporting to .csv when commas are embedded in the 'comment' text.
Any help would be appreciated.
Why don't you simply select all the data in the SSMS results window, copy it, and then paste it into a blank Excel file?
That should carry all the data over in the correct format, with comma-containing fields staying in a single column.
Try that.
So if you replace the ' with some special character, you can export it.
SELECT
    REPLACE(columnName, '''', '`')
FROM Table
Another option, if you use Management Studio, is the Import and Export Wizard:
https://learn.microsoft.com/en-us/sql/integration-services/import-export-data/start-the-sql-server-import-and-export-wizard
I want to use a format file to help import a comma-delimited file using BULK INSERT. I want to know how you generate format files from a flat file source. The Microsoft guidance on this subject makes it seem as though you can only generate a format file from a SQL table, but I want something that will look at a text file and tell me what the delimiters in that file are.
Surely this is possible.
Thanks
The format file can, and usually does, include more than just delimiters. It also frequently includes column data types, which is why it can only be generated automatically from the table or view the data is being retrieved from.
If you need to find the delimiters in a flat file, I'm sure there are a number of ways to create a script that could accomplish that, as well as creating a format file.
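For the delimiter-detection part, here is a minimal sketch in Python (with a hypothetical file name) using the standard library's csv.Sniffer. Note that it only guesses the delimiter; it does not give you the column data types a full format file needs.

import csv

# Hypothetical flat file; adjust the name and candidate delimiters as needed.
with open("data.txt", newline="", encoding="utf-8") as f:
    sample = f.read(4096)

dialect = csv.Sniffer().sniff(sample, delimiters=",;\t|")
print("Detected delimiter:", repr(dialect.delimiter))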
I'm trying to create a csv in my bash script from some values I'm getting from another non-csv file.
The problem I see is that the values have commas(,) in them.
The csv file comes out wrong because of that (the values with commas in them are split into 2 or more different values).
Is there any way to get rid of that problem, or any other way to build a csv in a bash script? I can create other file types too; it just needs to be compatible with a standard SQL import.
Thanks
I now added quotation marks before and after every value and it works great. The csv looks like it should.
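For anyone hitting the same thing: quoting every field (and doubling any quote characters that appear inside a value) is the standard CSV fix, and SQL importers generally accept it. A minimal sketch of the same idea in Python with made-up values, in case the bash quoting gets fiddly:

import csv

# Made-up values, some containing commas
rows = [
    ["first value", "a value, with commas", "another, one"],
    ["second row", "plain value", "1,234"],
]

with open("out.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f, quoting=csv.QUOTE_ALL)  # wrap every field in quotes
    writer.writerows(rows)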