I have some code in kdb+ (which uses q) that generates data in tabular form. The problem is that I need to run each line separately and export the data to Excel. Is there any way to automate this so that all the Excel files can be generated in one go?
Thanks
Whilst you can of course use CSV as suggested above, .h.edsn is the function used for creating Excel workbooks. It takes a dictionary of sheetNames->tables.
`:/var/tmp/excel.xls 0: .h.edsn `tab1`tab2!(([]c1:10?10);([]c2:20?20))
Then just open the .xls file in Excel.
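If one CSV per table is enough (the route mentioned above), here is a minimal sketch using two hypothetical tables t1 and t2:

t1:([]c1:10?10)
t2:([]c2:20?20)
`:/var/tmp/tab1.csv 0: csv 0: t1
`:/var/tmp/tab2.csv 0: csv 0: t2

Each file opens directly in Excel, at the cost of losing the multi-sheet workbook that .h.edsn gives you.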
I'm at the start of learning/using SAS, so I'm hoping this is easy. I just can't find any solutions, certainly none simple enough for me to understand at this stage!
I'm looking to pass the file names (full paths) to a macro I've already got working.
I can get the file names into a dataset, but is there a way to pass these names into the macro so that it runs for each file, one at a time? I'm looking to 'process' each CSV in a folder through the macro individually.
My code is pretty simple:
Info I want to pass to the macro:
Data fName;
  /* one observation per file path, in a character variable such as fname */
Run;
(A single column of 10 file paths, for example)
The macro I want to 'push' these to is:
%processFiles(FileToProcess= '//example/test/file01.csv')
The simple dataset works and the macro works; I just can't figure out how to 'trigger' it for each file in that dataset.
Hopefully it's something easy.
Thanks
I've tried several approaches: loops, passing as a list, nesting macros, etc.
Use the dataset to generate the calls to the macro.
So if the dataset is named FILES and the variable with the file name is called FNAME, then you just need to run this data step to call the macro once for each file in the dataset.
data _null_;
  set files ;
  * generate one macro call per observation and queue it for execution after this step ;
  call execute('%nrstr(%processFiles)(FileToProcess='||quote(trim(fname),"'")||')');
run;
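With the example path from the question, each observation generates a call like:

%processFiles(FileToProcess='//example/test/file01.csv')

Wrapping the macro name in %nrstr stops the macro from running while CALL EXECUTE is still resolving its argument, so each call executes after the data step finishes instead of mid-step.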
The function module 'TEXT_CONVERT_XLS_TO_SAP' opens a new, empty Excel sheet while uploading the file.
Please give me a solution for this; I do not want the empty Excel file to be opened.
Usually I use the FM ALSM_EXCEL_TO_INTERNAL_TABLE for Excel-sheet-to-internal-table conversion. Try it out; it's much easier.
I am working on an SSIS 2012 project where I receive a dataset after executing a stored procedure and save the data to an Excel file using an "Excel Destination" in the data flow. It works well; however, for some fields the column width is not large enough to show the value. Some numbers are shown as #####, for example, or the end user sees only part of a sentence instead of the whole thing.
Usually I handle this problem in Visual Studio using code like:
// auto-fit each column so values are not truncated or shown as #####
for (int i = 1; i <= RAWDATA_COLUMNS.Length; i++)
{
    worksheet.Column(i).AutoFit();
}
However, I could not find a way to handle this in SSIS. Writing a script to receive the data and save it into the Excel file is not an option; using the Excel Destination in the data flow is a must. Any help or advice would be appreciated.
I need to create an SSIS package which will run daily and export a table's data into a directory. The exported Excel file has a predefined format, so I have used a template Excel file (an Excel file with column headers only).
Here are the steps I followed:
Created a variable Filename which holds the location and name of the Excel file to be generated (based on the current date value); see the expression sketch after these steps.
Added a File System Task in the control flow, with the template Excel file as the source and the Filename variable as the destination.
Added a Data Flow Task in the control flow and connected it to the File System Task.
In the Data Flow Task, added an OLE DB Source and configured it with the source table (the table whose data needs to be copied into the Excel file).
Added an Excel Connection Manager and set its ExcelFilePath property to the Filename variable.
Added an Excel Destination and configured it with the Excel Connection Manager.
Set DelayValidation to true on the Data Flow Task and executed the package.
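For the date-based Filename variable in the first step, here is a sketch of the kind of expression that can be set on the variable; the folder, base name, and extension are placeholders:

"\\\\fileserver\\exports\\TableExport_"
  + (DT_WSTR, 4) YEAR(GETDATE())
  + RIGHT("0" + (DT_WSTR, 2) MONTH(GETDATE()), 2)
  + RIGHT("0" + (DT_WSTR, 2) DAY(GETDATE()), 2)
  + ".xls"

With EvaluateAsExpression set to true on the variable, this yields a path ending in yyyymmdd.xls for the current date each time the package runs.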
(Control Flow and Data Flow screenshots omitted.)
The package runs successfully and the Excel file gets generated in the desired directory, but the Excel file skips around 19000 rows and only starts copying data after that. Why is this happening?
Can anyone help me solve the issue?
Thanks for the help
It is possible that the template file is already formatted and that there are used rows further down; Excel often jumps down or adds rows if you do not delete rows that have already been used, even if they look empty. We must also consider strange events!
I want to export data to Excel. I also want to apply formatting such as cell merging, cell borders, and text underlining, but the export becomes slow because of this merging, alignment, etc. when the data is large. Here is a sample of the Excel output I produce: click here
I saw some articles related to Excel export: exporting cell by cell, writing an array to a range of cells, etc.
Earlier I had problems with slow data export to a Word file of about 25-35 pages using the Word object; it was taking 1 to 2 minutes. For that problem I switched to the RTF file format without using the Word object, which let me export in a few seconds; I used file I/O to write the data.
So, is there any similar Excel-file-format-aware method that can export to Excel faster while still supporting cell merging, alignment, borders, etc.?
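One of the techniques from those articles, writing an array to a range of cells, is usually the biggest win when the Excel object model has to be used at all: build the data in memory, push it to the sheet in one call, and apply merges/borders/underline to whole ranges rather than cell by cell. A minimal C# sketch of that idea using Excel COM interop; the sizes, sheet layout, and output path are assumptions for illustration:

using Excel = Microsoft.Office.Interop.Excel;

class FastExcelExport
{
    static void Main()
    {
        var app = new Excel.Application { Visible = false };
        var book = app.Workbooks.Add();
        var sheet = (Excel.Worksheet)book.Worksheets[1];

        const int rows = 10000, cols = 5;              // placeholder sizes
        var data = new object[rows, cols];             // build everything in memory first
        for (int r = 0; r < rows; r++)
            for (int c = 0; c < cols; c++)
                data[r, c] = r * cols + c;

        // Title row: merge and underline a whole range, not individual cells.
        var title = (Excel.Range)sheet.Range[sheet.Cells[1, 1], sheet.Cells[1, cols]];
        title.Merge();
        title.Value2 = "Report title";
        title.Font.Underline = Excel.XlUnderlineStyle.xlUnderlineStyleSingle;

        // Data block: one COM call writes the whole array instead of rows*cols calls.
        var block = (Excel.Range)sheet.Range[sheet.Cells[2, 1], sheet.Cells[rows + 1, cols]];
        block.Value2 = data;
        block.Borders.LineStyle = Excel.XlLineStyle.xlContinuous;  // border applied to the range

        book.SaveAs(@"C:\temp\export.xlsx");           // placeholder path
        book.Close();
        app.Quit();
    }
}

The speed difference comes from the number of COM round trips rather than the amount of data: each cell-level call crosses the process boundary to Excel, so collapsing thousands of them into a handful of range-level calls is what brings the export time down, much like the RTF approach avoided the Word object entirely.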