Filter certain SQL data formatted in one column into a new column - sql

Before I begin: of the research I have done, I found this question to be the most relevant:
How to split the data from one column into separate columns using the contents of another column in SQL
Attached are pictures of my progress so far. How can I display this information the way it is shown in the Excel file without disrupting the GROUP BY in my query?
It's a Fishbowl database, newest version. I am running the queries through FlameRobin, which you can see in the picture. I'm trying to organize the query to display correctly so I can format it in 'iReports' and export it into an Excel spreadsheet like the one shown. Maybe some part of this would be better done in Excel?
Notice the numbers for Qty are different; that's OK for now.
My reputation is too low to post pictures, sorry. Here are the two JPGs in my Dropbox. I really appreciate the help.
https://www.dropbox.com/sh/r2rw5r2awsyvzs9/AAAXXg27CMPOYtZFqPX3Dx6la?dl=0
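From my research, the usual way to split one column's values into separate output columns without disrupting a GROUP BY seems to be conditional aggregation (a "manual pivot"). Here is a rough sketch with made-up table, column, and category names (not the real Fishbowl schema), just to show the shape of what I'm after:

    -- Placeholder schema: Part/Stock and SizeCode are assumptions, not actual Fishbowl tables.
    SELECT
        p.PartNumber,
        SUM(CASE WHEN s.SizeCode = 'S' THEN s.Qty ELSE 0 END) AS QtySmall,
        SUM(CASE WHEN s.SizeCode = 'M' THEN s.Qty ELSE 0 END) AS QtyMedium,
        SUM(CASE WHEN s.SizeCode = 'L' THEN s.Qty ELSE 0 END) AS QtyLarge
    FROM Part p
    JOIN Stock s ON s.PartId = p.Id
    GROUP BY p.PartNumber;

Each CASE picks out only the rows belonging to one category, so the GROUP BY still collapses to one row per part while each category lands in its own column.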

Related

Use an Excel Cell as Input Data for Filtering a SQL Server View

I have a basic view in SQL Server which includes these columns:
"Lot Number", "Time", "Code", "Name", "Barcode"
Now I need an Excel file which will have EXACTLY the same columns and the same information.
What I want is this: when I type a number into the "Lot Number" cell in the Excel file, the other columns should fill themselves in automatically based on the SQL Server view. Basically, I'll be using the "Lot Number" cell as a filter.
The problem is that this is an ERP system database. I don't want to extract all of the information into the Excel file; there are millions of rows. Excel should start blank: I will enter the lot number first, then refresh the Excel file, and the other columns should be filled in automatically. I will refresh the Excel file automatically every minute, and only one row is needed.
So I need a procedure in SQL Server (or in Excel) that receives the value from the Excel cell, uses it to filter the view, and returns the result to the same Excel file.
Now I'll show you my setup; I'm waiting for your ideas.
This screenshot shows the SQL Server view on the "192.168.2.100" server. My SQL knowledge is pretty basic; I create views with the designer, not by writing code.
And this is the Excel file that I connected. As you can see, I can extract all of the data. (I deleted the Code/Name/Barcode cells for the screenshot; they are not actually empty, but they contain official information.)
Obviously this is not what I want. When I enter the lot number, I want the other data to come in automatically, at least when I refresh the Excel file. The Excel file is at "192.168.2.116\\barkod\\Tyvek_Mdr.xls" on a shared network.
I have been researching but still haven't found the best solution. If someone can help me, I'd really appreciate it.
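One idea I've been considering is a small parameterized procedure that Excel (via Microsoft Query or a short VBA/ADO call) could execute with the lot number from the cell. This is only a rough sketch: the view name and the procedure name are made up, and the columns are the ones listed above.

    -- Hypothetical sketch: dbo.LotView and dbo.GetLotInfo are placeholder names.
    CREATE PROCEDURE dbo.GetLotInfo
        @LotNumber NVARCHAR(50)
    AS
    BEGIN
        SET NOCOUNT ON;

        -- Return the single matching row from the view.
        SELECT TOP (1)
               [Lot Number], [Time], [Code], [Name], [Barcode]
        FROM   dbo.LotView
        WHERE  [Lot Number] = @LotNumber;
    END;

The idea would be that Excel passes the value of the "Lot Number" cell as @LotNumber on every refresh, so only the one matching row ever travels over the network instead of millions.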

Microsoft Access Table Shows Up Blank, But Query Correctly Pulls Data From Table

I am having an issue with my Microsoft Access database. One of my tables looks completely blank, but it shows 11,632 records at the bottom. Take a look at this screenshot. Though the table shows up blank, when I run the query it pulls the correct data from this table, so I know the data is there; it is just not appearing for some reason. I have tried Access 2013 and 2016 on a different computer, and both show the same behavior. I have also tried compacting and repairing, and exporting the table, but the exported file also appears blank aside from the field names. Any ideas on what I could try?
Thanks!
Turn your import into a two-step process (or more). Write the data raw into a scratch-pad table, then fire an append query that has the appropriate criteria so that only valid records go into the final table.
This isn't unusual at all given that some outside data sources may not have the controls to prevent bad records. Sometimes one must "wash" the data with several different query criteria/steps in order to filter out the bad guys.
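As an illustration, the append step might look something like this hypothetical Access SQL; the table names, fields, and criteria are placeholders you would adapt to whatever defines a "good" record in your data:

    -- Move only valid rows from the scratch-pad table into the final table.
    -- StagingTable, FinalTable, and the field names are placeholders.
    INSERT INTO FinalTable (ID, CustomerName, OrderDate)
    SELECT ID, CustomerName, OrderDate
    FROM StagingTable
    WHERE ID IS NOT NULL
      AND CustomerName <> '';

Anything that fails the criteria simply stays behind in the scratch-pad table, where you can inspect or discard it.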

(Excel-VBA) Specific data import (in the background) into the active sheet

Would you please help me (total beginner) to prepare a VBA macro that would open a sheet in the background and import a specific selection, as shown below.
Let's say we have a word-count analysis (xlsx) like this one, downloaded from a CAT tool for testing.
Now I need to add a macro to my main sheet that reads the lines starting (in Column A) with "All". For each such line, I'd need to record its columns (specifically Columns A - O) in an array / hashtable?
Please take a look at this image that sums it all up (better than me explaining it :-)
Let me know if you need more details.
All tips / suggestions are greatly appreciated.
Many thanks!
My suggestion (I'm a beginner too) would be to use the Macro Recorder. It's a great tool to learn from (example).
Start recording
Filter for "All"
Copy/paste the cells
Stop recording
Then have a look at the recorded code and adjust it :)
Looking at your data and the final layout you want, a Pivot Table would provide you with all of the flexibility you need.
You can:
filter which data to display
generate calculated values based on data in other columns
choose what order your columns are displayed
dynamically change the layout if you decide you want a different view
From your data, I was able to generate the following Pivot Table in about 15 minutes.
There are several good, simple tutorials on building Pivot Tables. A Google search will turn up plenty.
Things you will need to learn about for your particular problem:
Classic display (I used the classic display to get this particular layout)
Calculated Fields (many of the columns in the pivot table are calculated based on your spec). There is a maximum string length of 255 characters for a field calculation, so you may need to rename some of the columns in the original data set.
Of course, basics of Pivot Tables
Loading new data and updating your pivot table
Good Luck!

In SQL how do I update a table with a similar table?

In my current database I have a table whose data is either entered manually or comes in an Excel sheet every week. Before we had the "manual entry" option, the table would simply be dropped and replaced by the Excel version.
Now, because there is data that exists only in the original table, this cannot be done.
I'm trying to find a way to update the original table with changes and additions from the (Excel) table while preserving all rows not in the new sheet.
I've been attempting to simply use an insert query and an update query, but I can't find a way to detect changes in a record.
Any suggestions? I can provide the current sql if you'd find that helpful.
Based on what I have read so far, I think I can offer some suggestions:
It appears you have control of the MS Access database. I would suggest adding a field called "source" to your data table. Modify the form in your Access database to store something like "m" for manual entry in the source field. When you import the Excel data, store an "e" for Excel in that field.
You would need to do a one-time scrub of the data to mark existing records as manual entries or Excel entries. There are a couple of ways to do that through automation/queries, which I can explain in detail if you want.
Once past these steps, your Excel process is fairly simple: delete all records with source = "e" and then do a full Excel import, as sketched below. Manual records would remain unchanged.
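A rough sketch of that reimport step, with made-up table and column names (ExcelImport standing in for wherever the weekly spreadsheet lands):

    -- Clear out the previous Excel-sourced rows, leaving manual entries alone.
    -- MainTable, ExcelImport, and the column names are placeholders.
    DELETE FROM MainTable
    WHERE Source = 'e';

    -- Reload the fresh weekly data, tagging it as Excel-sourced.
    INSERT INTO MainTable (ItemID, ItemName, Qty, Source)
    SELECT ItemID, ItemName, Qty, 'e'
    FROM ExcelImport;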
This concept will let you add new sources and codes and handle each one differently if needed. You just need to spend some time cleaning up your old data; I think you will find it worth it in the end.
Good Luck.

SQL Server 2008 - TSQL Read CSV file

I am working on a project that basically entails importing a CSV file into a SQL Server 2008 R2 database. The CSV file is generated from an Excel file that is populated by a "manager" with PR hours for his employees. It also includes some additional information, such as which job and phase the employees were working on, and the number of hours for any equipment used.
Once you generate a CSV file from that, it's not exactly the usual straightforward "column"-based CSV file. It's more like a "row"-based CSV file, with each row being somewhat unique. Because of this caveat, I cannot do a straight dump (using BULK INSERT or OPENROWSET) into SQL, which would essentially create a (temp) table with the appropriate columns filled.
I am looking to use the fields within the CSV file based on the "location" of each field in the row.
So, basically, the positions of the data will remain the same, since every CSV is based on a TEMPLATE file; all I have to do is navigate through the CSV file using SQL code to find the right field based on its position in the row. I hope that gives you guys a better understanding of what I am trying to achieve here. Sorry for the long wall of text.
I researched a bit and here's what I have come up with so far:
Reads CSV files into a temp table through a custom SQL function (Reading lines from a file)
https://www.simple-talk.com/sql/t-sql-programming/reading-and-writing-files-in-sql-server-using-t-sql/
This one is interesting: it dumps the whole file as a BLOB, and then you can sift through the data (see my sketch after this list).
http://www.mssqltips.com/sqlservertip/1643/using-openrowset-to-read-large-files-into-sql-server/
Finally, this one essentially splits out the rows and creates a separate record per row. Interesting.
http://ask.sqlservercentral.com/questions/17408/how-to-read-a-text-file.html
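To show what I mean, here is a rough sketch of the second approach (OPENROWSET with SINGLE_CLOB) that I've been playing with; the file path and the field positions are placeholders for whatever the template dictates:

    -- Dump the whole CSV into one string (the path is a placeholder).
    DECLARE @raw VARCHAR(MAX);

    SELECT @raw = BulkColumn
    FROM OPENROWSET(BULK 'C:\imports\pr_hours.csv', SINGLE_CLOB) AS f;

    -- Grab the first line (strip any carriage return).
    DECLARE @line VARCHAR(MAX) =
        REPLACE(LEFT(@raw, CHARINDEX(CHAR(10), @raw + CHAR(10)) - 1), CHAR(13), '');

    -- Example: pull out the field sitting between the 2nd and 3rd commas
    -- (assumes the line has at least three commas).
    DECLARE @c2 INT = CHARINDEX(',', @line, CHARINDEX(',', @line) + 1);
    DECLARE @c3 INT = CHARINDEX(',', @line, @c2 + 1);

    SELECT SUBSTRING(@line, @c2 + 1, @c3 - @c2 - 1) AS ThirdField;

The same CHARINDEX/SUBSTRING pattern would repeat for each field position the template defines, since SQL Server 2008 has no built-in string-split function.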
If anyone has any suggestions or steps that I could follow to get through this, I would greatly appreciate it.
To the Mods: If I have posted something (especially the links) that shouldn't be here, please feel free to remove it. I apologize if I did.
Thanks much.. Hope to hear some positive responses! :)
Warm Regards,
Pranav
If the file is not too large, another option is to post-process the file in Excel using a VBA macro. Of course, you'd need to come up to speed with the Excel object model and VBA, but the macro recorder makes that fairly simple. One advantage of the VBA approach is that you really do seem to want row-by-row processing, and VBA is better suited to that, whereas SQL is better for set-based operations.