We have some old reports which are no longer used by the business, and we wish to remove or archive them.
I have 2 queries related to this:
a) What is the best way to find if a report has not been used for the past 12+ months?
b) Is there any simple way of moving the reports no longer being used (not used for > 12 months) to a different location (i.e. new folder), while keeping the folder structure intact?
We have searched the web but have not been able to find an automated solution for this. As the number of reports we have identified is in the range of ~5,000, we are looking for an automated way to do it.
Would running an SQL query directly on the server (the physical machine) be advisable? If we run such a query against the content store, we need to figure out which column/field holds the actual report path (in the query below I have used the placeholders <table> and <report_path>, but I am not sure whether such a field exists or whether such a query can be used at all):
update <table> set <report_path>='/content/folder[#name='Home']/folder[#name='Report']/report[#name='ABC012 - My Report']' where <report_path>='/content/folder[#name='Home']/folder[#name='Archive_test']/report[#name='ABC012 - My Report']'
Would this kind of a query work?
If not, can anyone suggest a way in which we can move reports to a single folder on the same Cognos box? (We are using Cognos 10, with DB2 and Netezza.)
I can help you with a)
http://pic.dhe.ibm.com/infocenter/cfpm/v10r1m0/index.jsp?topic=%2Fcom.ibm.swg.im.cognos.ug_cra.10.1.0.doc%2Fug_cra_id4425SampleAuditReports.html
You can use the provided package and sample reports to find the information you need.
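If you would rather query the audit database directly (not the content store), something along these lines lists the reports that have been run in the last 12 months; anything in your inventory that does not appear is an archiving candidate. This is only a sketch: it assumes report-run auditing has been enabled long enough to cover the period, that the standard audit table COGIPF_RUNREPORT with its COGIPF_REPORTPATH and COGIPF_LOCALTIMESTAMP columns is present, and the date arithmetic shown is DB2-flavoured, so adjust it for whatever platform hosts your audit database.

-- Reports that HAVE run in the last 12 months (audit database, not the content store).
-- Table and column names assume the standard Cognos audit schema; verify against yours.
SELECT DISTINCT COGIPF_REPORTPATH
FROM COGIPF_RUNREPORT
WHERE COGIPF_LOCALTIMESTAMP > CURRENT TIMESTAMP - 12 MONTHS
ORDER BY COGIPF_REPORTPATH;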
And for b), you should look at the Cognos SDK. I don't think that updating the content store tables directly is a good idea.
I have tried to read up on this topic, and I am still a bit unclear on how to proceed. This seemed like a fairly basic task, but it has been nowhere near as simple as I had assumed. I have several SQL queries written, and I want to be able to schedule them to run on a certain day each month and then have the results automatically exported to a .csv file in a selected folder. This will then allow them to be automatically uploaded into the BI and reporting tool that our firm uses (this part I know how to take care of).
I am fairly well versed in writing SQL queries, but I am pretty lost on everything beyond that. Right now I am using Microsoft SQL Server Management Studio 17. I thought that maybe scheduling jobs using SQL Server Agent would be the solution, but the more I read about it and go down that path, the less convinced I am that it will allow me to export the query results to the .csv file that I need to have picked up. It is also important that these results are exported without headers.
Does anyone have any solutions for this? I am happy to answer any follow-up questions if I am at all unclear.
You can create a job within SQL Server Management Studio (using SQL Server Agent) to handle the whole thing.
http://social.msdn.microsoft.com/Forums/en/transactsql/thread/7d2280cf-3b33-46f7-ba82-4131e8a841c0
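For example, a SQL Agent job with a CmdExec step that runs sqlcmd will write the result set to a .csv without headers (-h -1 suppresses them, -s sets the separator). This is only a sketch; the server, database, query, schedule, and file path below are placeholders:

USE msdb;
-- Create the job and a CmdExec step that runs sqlcmd:
--   -h -1  no column headers   -s ","  comma separator   -W  trim trailing spaces
EXEC dbo.sp_add_job @job_name = N'MonthlyCsvExport';
EXEC dbo.sp_add_jobstep @job_name = N'MonthlyCsvExport',
     @step_name = N'Export query to csv',
     @subsystem = N'CmdExec',
     @command = N'sqlcmd -S MYSERVER -d MyDatabase -E -h -1 -s "," -W -Q "SET NOCOUNT ON; SELECT Col1, Col2 FROM dbo.MyMonthlyView" -o "C:\Exports\MyExtract.csv"';
-- Run at 06:00 on the 1st of every month (freq_type 16 = monthly)
EXEC dbo.sp_add_schedule @schedule_name = N'FirstOfMonth',
     @freq_type = 16, @freq_interval = 1, @freq_recurrence_factor = 1,
     @active_start_time = 060000;
EXEC dbo.sp_attach_schedule @job_name = N'MonthlyCsvExport', @schedule_name = N'FirstOfMonth';
EXEC dbo.sp_add_jobserver @job_name = N'MonthlyCsvExport';

Depending on your setup, the Agent service account (or a proxy) will need permission to write to the export folder.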
I need to find all the issues discovered in a snapshot/scan in SonarQube. I can't use the web API, since the volume can be excessive for new projects on their first scan. I have a query that can find the latest snapshot together with the project information, and I can query issues by project, but I can't figure out how to relate issues to a snapshot. There has to be a way, since SonarQube does it - "New issues" on the project page.
Has anyone done this, or does anyone have enough experience with the crazy schema to be able to figure it out? I can't wait for the schema rationalization...
SonarQube 5.6.3 on Windows 2012 R2 with SQL Server 2012.
There is currently no association between snapshot and issue. Nor has there ever been one. The closest you can come is to use date parameters to narrow the set of issues created right around the time of your analysis. Note that this could be difficult if you run analyses close together.
The "new issues" metrics shown on the project homepage are just that - metrics. However, if you click through on one, you'll find yourself in a date-based Issues search.
You can do the same sort of thing using the web service, again, via date-based criteria. Or you could use the sinceLeakPeriod parameter.
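If you do decide to go against the database anyway (not the supported route), the same date-based idea looks roughly like the sketch below. Treat every table and column name as an assumption to check against your 5.6.3 instance - the schema is internal and changes between versions - and note that recent schemas store these dates as milliseconds since the epoch.

-- Rough, version-dependent sketch: issues created at or after the latest analysis
-- of one project. Every table/column name here is an assumption; verify it first.
SELECT i.*
FROM issues i
JOIN projects p ON p.uuid = i.project_uuid
WHERE p.kee = 'my:project'                              -- hypothetical project key
  AND i.issue_creation_date >= (SELECT MAX(s.created_at)
                                FROM snapshots s
                                JOIN projects p2 ON p2.id = s.project_id
                                WHERE p2.kee = 'my:project');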
I have looked at a few Stack Overflow forum posts, but nothing fits what I need help with (or at least I don't think so).
I'm looking for general advice. My company has 'tasked' me with looking at moving some data from tables stored in our parent company's databases into a database of our own, so that all the information we need is in one place.
For instance, if we want information related to one thing, we may have to pull data from several different databases. I think I can get my head around moving the data and can create a SQL query to do it; however, we're currently using SQL Server Express as our database (the company is more than happy to buy/set up a full SQL Server, but as far as we can see SQL Express does what we need it to - feel free to correct me).
I need to look at scheduling the data move for every hour or every few hours, to keep the data up to date for when reports are generated from it.
I have looked at a few programs, but as the queries and the database are on a Server 2008 R2 system, some of these 'programs' don't like it, since they were last updated pre-2010. I have also installed SQL Server Management Studio 2012 for the sake of SQL Server Agent, but I can't even get that working (the service is enabled and I have restarted the database, but there is still nothing within the studio).
I'm not necessarily looking for a 'do this and that and that' type of reply (though I'm more than happy to accept that much help); I'd just appreciate it if you guys/gals could point me in the right direction.
Summary:
-Combining data already held in our parent company's databases into a table/DB of our own making
-Currently using SQL Express, but willing to upgrade to something else that does the job
-Scheduling the data moves for every X hours (Windows scheduling?)
-Automating the entire thing so we don't have to do the moves manually.
Help on any of the points above would be greatly appreciated and I would 'love you long time' for the help.
JB
There are a bunch of limitations in SQL Express. One of them is that SQL Server Agent is not supported; SSIS, like SQL Agent, is also not supported.
http://msdn.microsoft.com/en-us/library/cc645993.aspx
Do not fret, you can always schedule a job with Windows Scheduler.
http://windows.microsoft.com/en-US/windows/schedule-task#1TC=windows-7
As for moving the data, it is up to you to select a solution.
1 - Write a PowerShell script to perform the Extract, Transform, and Load (ETL).
http://technet.microsoft.com/en-us/library/cc281945(v=sql.105).aspx
2 - Use SQLCMD to run T-SQL logic such as calling stored procedures (see the sketch after the links below).
http://technet.microsoft.com/en-us/library/ms162773.aspx
3 - Use BCP to dump and load data.
http://craftydba.com/?p=1245
http://craftydba.com/?p=1255
http://craftydba.com/?p=1584
http://craftydba.com/?p=1690
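As a rough illustration of option 2 combined with the Windows Task Scheduler, here is a minimal sketch; the linked server, database, and table names are placeholders, and a simple full refresh stands in for a proper incremental load.

-- pull_data.sql: schedule with the Windows Task Scheduler, e.g. an hourly task running
--   sqlcmd -S .\SQLEXPRESS -d OurReportingDb -E -i "C:\Jobs\pull_data.sql"
SET NOCOUNT ON;

-- Simple full refresh of a local copy of one of the parent company's tables
TRUNCATE TABLE dbo.CustomerCopy;

INSERT INTO dbo.CustomerCopy (CustomerId, CustomerName, Region)
SELECT CustomerId, CustomerName, Region
FROM [PARENTSRV].[ParentDb].[dbo].[Customer];   -- four-part linked-server name

A MERGE against the remote table would give you an incremental load instead, at the cost of a bit more T-SQL.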
It is funny how youngsters think they need to spend a lot of money to create a solution for a business.
However, Microsoft supplies you with a lot of free tools.
You just have to put them together into a solution.
PS: I remember that about 10 years ago I created a custom ETL solution using VBScript. Unlike PowerShell, it is available on both old and new versions of Windows.
Good luck!
You can create a console application which executes a stored procedure that handles your logic. ( http://dotnet.dzone.com/articles/basics-stored-procedures-net )
Of course using SSIS is much easier but it's not available in SQL Server Express Edition.
I think you should look at Integration Services (SSIS), although it is not available for the Express Edition. Have a look at this article to get started with SSIS.
I'm a programmer (mostly C++) who has moved into a non-software workplace. However, I don't have much experience with database stuff at all.
TL;DR: If we compare Crystal Reports to just writing scripts that execute SQL queries and parse the results, is there anything that CR can do that isn't possible via SQL queries & scripts? I'm talking purely in terms of extracting data - not making pretty documents.
Detail:
At my workplace they have a process where you run a bunch of Crystal Reports, modify the date range to the current month, manually export each to Excel, delete the rows and columns that aren't needed, and then cut and paste the result into a summary Excel document that is used by management.
To me, this is pretty crazy and stupid. I'd like to automate/script most of it.
So I have two options:
Learn Crystal Reports and try to modify the existing reports to be more automated.
Dump CR and just learn SQL and do the whole thing programmatically with scripts working with CSV files or something.
I'd much rather learn SQL since it's more general and useful. But I need to be assured that I can get the data output that I need (without writing a million lines of code to reproduce CR myself.)
So yeah, I'm looking for an answer like, "The two are equivalent. Anything you can do in CR you can do easily via scripts and SQL," or "If you need to group records into categories based on a parameter and then sum one of their fields, then CR will do it much more easily than raw code," to push me in one direction or the other.
Edit:
Some additional detail: at the moment my Crystal Reports run a database query, and then Crystal does things like "don't display the records that are returned; instead, group the records by Field A and then display the count of how many records are in each group."
Is functionality like this difficult to reproduce via SQL coding? I wouldn't want to have to write a Python (or whatever) script to parse and manipulate the data from a plaintext CSV, for example.
You can't just compare SQL and CR - they have different purposes. SQL (in this context) is the data source; CR is a pretty output formatter. For Excel you would need data, not formatted output. Excel combined with SQL can give you all the CR options (dynamic crosstab reports, charts, etc.) which you can't get directly from SQL data.
BTW, creating SQL views or procedures is often needed to overcome CR limitations; from this standpoint SQL has a lot more options than CR.
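For instance, the grouping example from the question ("group the records by Field A and display the count in each group") is a one-liner in SQL; the table and column names below are just placeholders:

SELECT FieldA, COUNT(*) AS RecordCount
FROM dbo.SomeTable
GROUP BY FieldA
ORDER BY FieldA;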
I personally would go with the SQL+Excel route. In our company we use plain SQL+CR without post-processing, and sometimes SQL+Excel. Our customers use different approaches.
But as other people have said, the choice of tools depends on more things. Who has to redesign the reports? Who will maintain them? How often do requirements change? Are there other uses for the CR reports besides sourcing Excel tables? Who will be woken up at night if the reports don't work?
Management perspective:
In many (I would say most) cases, management does not know SQL. So if a manager, e.g. in HR, wants to know the status of something, how will he get that status? This is where Crystal Reports comes into the picture: using Crystal Reports, they do not have to worry about SQL; they just enter the required fields and get their data.
Programmer perspective:
Simple data outputs can be achieved through SQL, but consider a scenario where you need to pull details as well as a summary. I agree it can be done via SQL, but consider the overhead in time and proficiency required to develop such output using SQL; I bet it won't be as easy as in Crystal. So I would say learn both SQL and Crystal, and you will get to choose the right tool for each requirement.
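To make that concrete, the SQL side of a detail-plus-summary output would typically involve something like ROLLUP (the names below are made up, and the exact syntax varies slightly between databases), and the layout and formatting on top of it is still up to you:

SELECT Department, Employee, SUM(Hours) AS TotalHours
FROM dbo.Timesheet
GROUP BY ROLLUP (Department, Employee)   -- employee rows, department subtotals, grand total
ORDER BY Department, Employee;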
You can write SQL and drop it into the Crystal Report. Best of both worlds, and possibly faster performance than the drag-and-drop Crystal functionality.
You will see some response time lag when the report runs.
There are actually a few things that Crystal Reports can do that are very tricky with plain SQL queries, as Crystal Reports can access the entire dataset in a single formula and can do things at runtime.
However, unless you have some really crazily complex Crystal Reports, I would recommend building a tool in Excel that can pull the info straight into a new sheet with one click.
I did this and it got me a promotion, not kidding :P
I have a custom Excel add-in I can give you the code for, which basically does this:
On open, connects to the database and downloads a list of menu options connected to views and procedures
Adds these menu options into a new Ribbon tab within Excel
When one is clicked, runs the view and dumps the entire dataset (properly formatted) into a new sheet
The advantage of this is that you can update the main menu list, and each view it references, without making any changes to the file or re-issuing anything to everyone.
Crystal could be helpful if you want to create a document with a specific layout, logos, etc., and show some data on it. Exporting to Excel from a Crystal report is not easy - usually there are a lot of empty columns and rows, and each report has to be tweaked to avoid that.
If you need to export some data from a SQL Server database to Excel, your best option will be SSIS (I guess you have a licence for SQL Server). If you don't have a licence for SSIS, or you are using, for example, an Access database, there are also some inexpensive tools which can retrieve data from any database (not just SQL Server) and export it to Excel. I would suggest you check this one: http://www.r-tag.com. It can run Crystal reports and SQL reports, so you can start using your Crystal reports immediately and transform them into SQL reports whenever you have time for that. Both kinds of report can be exported to Excel.
I fixed this by editing the SQL behind the Excel export and wrapping the column in LEFT(Column_maxLength, 250).
This resolved my issue; in my case, reading only the leftmost 250 characters was enough.
I have two MS SQL 2005 databases, a TEST and a DEV database. Our developer added some extra columns, tables, etc. to the DEV database, which created differences from the TEST database. Is there a script I can write that can tell me what the changes were in the DEV database between certain dates? I found a couple of tools, but they are quite basic and don't really generate change scripts. I also tried the change script function in Management Studio, but it only seems to work when the change is first made, not later.
Appreciate your thoughts.
A.
redgate SQL Compare is a good tool to check for differences between databases and even sync them.
However, if you need to do this for free, do the following within SQL Server Management Studio (a quick first-pass check is also sketched after the steps):
1) script the entire schema of TEST to a file, look here if you don't know how
2) repeat step #1 but for the DEV database
3) diff the files using something like Beyond Compare, which should have a 30-day trial
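Before the full schema scripting, a quick first pass with the metadata views can already flag new columns and tables. A minimal sketch, assuming both databases sit on the same instance and are literally named DEV and TEST:

-- Columns present in DEV but missing from TEST (new tables show up here too)
SELECT TABLE_SCHEMA, TABLE_NAME, COLUMN_NAME
FROM DEV.INFORMATION_SCHEMA.COLUMNS
EXCEPT
SELECT TABLE_SCHEMA, TABLE_NAME, COLUMN_NAME
FROM TEST.INFORMATION_SCHEMA.COLUMNS;

Flip the two SELECTs to see what TEST has that DEV no longer does, and add DATA_TYPE to the column list to catch type changes as well.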
I wrote dbscript, and one of its features is to compare two database schemas and create a migration script.
What you need is a way to manage changes to your database schema and then apply them in a controlled and consistent manner. Moreover, you need a single authoritative source for the database schema.
For all this, check out Wizardby.