DBACOCKPIT tcode on two SAP systems looks different, why? - abap

I have two different systems with SAP installed on them. The first installation runs on SQL Server, and the other runs on Oracle.
In the first installation of SAP, running on SQL Server, when I run the DBACOCKPIT tcode I get the following subfolders:
Performance, Space, Backup And Recovery, Configuration, Jobs, Alerts, Diagnostics, Download.
However, on the second installation of SAP, running on Oracle, I get only the following subfolders: Performance, Space, Jobs, Diagnostics.
Why don't I get the other folders?
Both systems run ECC 6.0.
The SAP Basis components of the two are a bit different.

Despite the difference in DBACOCKPIT layout between the MS SQL system (A) and the Oracle system (B), I think it is normal. Possible reasons for this are:
- lack of the necessary user authorizations on system B
- a different set of authorizations for user A and user B
- the Oracle database was configured incorrectly or was not configured at all
SAP Note 1028624 regarding Oracle DBACOCKPIT says:
"Some performance monitors within the DBA Cockpit require special database objects. These objects are created using an SQL script. See Note 706927 for this script and more information.
Some functions in the DBA Cockpit require the Oracle Active Workload Repository. This Oracle option must therefore be available for selection (see Note 1028068)."
This note also specifies exactly which functions are available in DBACOCKPIT for Oracle, and this set fully corresponds to your screenshot.
The DBA Cockpit has a navigation area that is visible in all the functions of the DBA Cockpit. This area contains a menu tree with the following access points:
- Performance (corresponds to the old transaction ST04)
- Space (corresponds to the old transaction DB02)
- Jobs (corresponds to the old transactions DB13, DB12, DB14, DB13C)
- Diagnostics
At the same time MS SQL DBACOCKPIT also corresponds to SAP standard, which is confirmed by SAP Note 1027512 and by this datasheet, for example.
Possible steps for research:
- check whether the authorizations S_RZL_ADMIN and S_BTCH_ALL are assigned to you, as stated in the detailed DBACOCKPIT description from Note 1028624
- check the SAP Database Oracle Guide for compliance with your system B setup

This is a completely standard situation.
I also have two systems with the same version of ECC:
one on MS SQL Server 2014,
another one on SAP HANA SPS03,
and the DBACOCKPIT t-code looks different in these two systems.
All is OK!


Check permissions in a database

We are using SQL 2012 Enterprise Edition and have a small problem with one of our DBAs (5 people with more than 100 instances). Is it possible to check who has granted, denied or revoked permissions, and when this was done? I could not find any trigger or audits.
Thanks for your help and best regards from Hamburg in Germany.
Two choices spring to mind here:
Use SQL Server Audit, and create audit specifications to capture the events that you are interested in (see http://technet.microsoft.com/en-us/library/cc280386.aspx for more details)
Use server level triggers to capture all DDL events (see http://www.mssqltips.com/sqlservertip/2085/sql-server-ddl-triggers-to-track-all-database-changes/ for an example of how to do this)
We basically use the second approach across our production instances for audit purposes, and to record a complete history of all schema changes across all databases so that we can accurately identify what has changed (and by whom, and when) to assist with production troubleshooting of issues.
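As an illustration of the second approach, here is a minimal sketch of a server-level trigger that logs every GRANT, DENY and REVOKE (the table and trigger names are made up; adjust where the audit table lives to suit your environment):

```sql
-- Audit table to hold one row per GRANT/DENY/REVOKE event
CREATE TABLE master.dbo.PermissionChangeLog (
    EventTime  datetime NOT NULL DEFAULT GETDATE(),
    LoginName  sysname  NOT NULL,
    EventData  xml      NOT NULL
);
GO

-- Server-scoped DDL trigger: fires for permission changes at both
-- server and database scope (SQL Server 2008 and later)
CREATE TRIGGER trg_Audit_GDR
ON ALL SERVER
FOR DDL_GDR_SERVER_EVENTS, DDL_GDR_DATABASE_EVENTS
AS
BEGIN
    INSERT INTO master.dbo.PermissionChangeLog (LoginName, EventData)
    VALUES (ORIGINAL_LOGIN(), EVENTDATA());
END;
GO
```

ORIGINAL_LOGIN() is used rather than SUSER_SNAME() so that the log survives impersonation via EXECUTE AS; the EVENTDATA() XML contains the exact statement, the target object and the timestamp.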

How do I create a query usage log in VB.NET? [duplicate]

I'm using Oracle 11g Standard Edition.
I would like to log all SQL queries, that are being executed by users, into a table.
How can this be done?
If you're using a modern version of the database (9i or later) and you have an Enterprise Edition license, you can use Fine-Grained Auditing. It allows us to audit user queries at a very low level of granularity, through defined policies.
To capture SQL text and bind variables you will need to set the audit_trail parameter appropriately when adding an FGA policy. Find out more.
"i'm using an 11g standard, so auditing functions are not supported."
Not exactly. The AUDIT command is part of the standard Oracle build, but it only allows us to capture when a given user issues a SELECT against a given table. But, yes, to find out exactly what they are selecting requires Enterprise Edition license.
Also there is no ON SELECT trigger, so we cannot roll our own.
"So can i use AUDIT command in the standard edition? ... But then a
consultant told me, that i cannot use it without paying enterprise
license? "
Speaking as a consultant myself, I do have to say those guys don't always know what they are talking about.
So let's be clear:
the AUDIT command is part of Oracle SQL. It is usable with the Standard Edition; in fact, since 11g it is enabled by default. It audits general activity. Find out more.
Fine Grained Auditing is a PL/SQL package which is only usable if you have the Enterprise Edition. It allows us to audit user activity at a very low level. Find out more.
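To make the distinction concrete, here is a sketch of both mechanisms (the schema, table and policy names are made up; the FGA call requires an Enterprise Edition license):

```sql
-- Standard auditing (available in Standard Edition, on by default in 11g):
-- records WHO selected from the table, but not the statement itself
AUDIT SELECT ON scott.emp;

-- Fine Grained Auditing (Enterprise Edition): capture the SQL text
-- and bind variables of every SELECT against SCOTT.EMP
BEGIN
  DBMS_FGA.ADD_POLICY(
    object_schema   => 'SCOTT',
    object_name     => 'EMP',
    policy_name     => 'EMP_SEL_AUDIT',
    statement_types => 'SELECT',
    audit_trail     => DBMS_FGA.DB + DBMS_FGA.EXTENDED
  );
END;
/

-- Audited statements then show up in the FGA trail:
SELECT db_user, sql_text FROM dba_fga_audit_trail;
```

The audit_trail setting DBMS_FGA.DB + DBMS_FGA.EXTENDED is what makes the SQL text and binds available in dba_fga_audit_trail; with the plain DB value you get only the fact that the policy fired.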
For QUICK, EASY logging of SQL, try my monitoring answer here. Not for long-term logging, but works great just to see what is going on in a small time window. :-)

SQL Server 2012 Enterprise vs Standard - use of partitioning?

I have been developing against SQL Server 2012 Enterprise, and came to migrate to production, where I found our hosting provider had installed Standard. I didn't think it would be a problem, as I hadn't implemented any Enterprise-specific features. However, when I restored the DB it failed to activate, and in the Event Log I found a message indicating the database couldn't be activated because it contained features not supported by the edition. When I dug deeper, I found that it appeared that FTS or some other function had automatically created 5 partition functions and schemes.
I then went through a time-consuming process to remove the partition functions and schemes, and could successfully restore the database on the Standard edition.
After a while I backed up the DB (with no PFs or PSs), transferred it to my dev env, restored it (on SQL Enterprise), and after some time I found that a single partition function and scheme had been created. When I next came to backup and restore to prod, this time the database activated ok without error - even though there were partition functions and schemes.
I have just run the following:
SELECT feature_name FROM sys.dm_db_persisted_sku_features ;
from here
http://msdn.microsoft.com/en-us/library/cc280724.aspx
and found that for the DB with 5 partition functions/schemes, Partitioning is listed as an edition-specific feature. When running the same against the DB with 1 function/scheme, it's not listed.
Is there something going on here that Auto created, FTS related partition schemes are ok on standard edition, but not manually created/other types? (keep in mind I never manually implemented partitioning)
Based on the MSDN article Features Supported by the Editions of SQL Server 2012:
Only the Enterprise edition supports partitioning of tables and indexes. This means that if any table or index is partitioned, the database cannot be restored on any other edition. However, a partition scheme and partition function can exist without being used by any table or index. In that case the restore succeeds, as there are no partitioned tables or indexes.
Moreover, the RDBMS Manageability section states that Distributed Partitioned Views are supported by all editions to some extent, thus permitting partition schemes and functions to exist as definitions in all editions.
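You can verify this distinction yourself; the sketch below (object names are made up) creates a partition function and scheme that no table uses, which is exactly the state that restores fine on Standard:

```sql
-- A partition function and scheme can exist with no table using them:
CREATE PARTITION FUNCTION pf_demo (int)
AS RANGE RIGHT FOR VALUES (100, 200);

CREATE PARTITION SCHEME ps_demo
AS PARTITION pf_demo ALL TO ([PRIMARY]);

-- 'Partitioning' only appears in this DMV once a table or index
-- is actually created ON ps_demo:
SELECT feature_name FROM sys.dm_db_persisted_sku_features;
```

Creating a table with ON ps_demo(some_int_column) and re-running the SELECT should make Partitioning appear, and the database would then fail to activate on Standard.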

Choose Database for multi-user applications with only a file server (windows)

I need to choose a database solution for multiple simultaneous users of Windows-based applications using the same database on a file server.
I need a database that can live on a Windows OS file server.
- Must be shared by several applications running on individual MS Windows machines (mostly Windows 7).
- Made available by a file server.
- Cannot use a database server/engine (due to internal political rules) or a web page server.
- Prefer using C# for a set of WPF applications.
- Currently using a set of VB applications with a set of MS Access files; one of these applications has problems and needs a rewrite.
- The current set of about a half-dozen *.mdb files (some with link tables) totals about 400 MB. Growth at a guess of 10 to 20 MB/year.
- Up to about a dozen concurrent users, each on their own PC, currently. Don't expect much change to this in the future.
- All apps both read and write data to the database.
- Currently several people (about 4) write ad hoc queries in Access; they will continue to need to be able to write queries somehow.
- Would like to prevent changes to the database structure (adding tables/columns) by end users.
- Free software.
The choices I know about are:
Access .mdb files (the current situation).
SQLite.
SQL Server CE.
Are there other systems that might work that fit many or all of the desired characteristics? Are there particular "gotchas" I should know about for the systems that I am considering?
Well, "Cannot use a database server/engine" makes things harder. So does "free".
I think Access is the only thing in your list that comes close to fulfilling all the requirements. It's not free, but it seems you already have it, so at least it doesn't cost extra.
Access is essentially three different, bundled products:
- Jet database engine
- RAD environment for queries, forms, and reports
- VBA programming environment
If you're using only the database engine, it makes sense to do some testing with SQL Server CE.
Switching to SQLite would probably require additional checks in application code. SQLite supports storage classes, not data types. What does that mean? It means SQLite allows this.
sqlite> create table foo (n integer);
sqlite> insert into foo values ('wibble');
sqlite> select n from foo;
wibble
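If SQLite is still attractive despite that, the looseness above can be reined in with a CHECK constraint on typeof(); a minimal sketch (table names are made up):

```sql
-- 'integer' is only a type affinity, so this is accepted:
CREATE TABLE foo (n integer);
INSERT INTO foo VALUES ('wibble');   -- succeeds, stores the text 'wibble'

-- A CHECK on typeof() makes SQLite enforce the declared type:
CREATE TABLE bar (n integer CHECK (typeof(n) = 'integer'));
INSERT INTO bar VALUES ('wibble');   -- fails: CHECK constraint failed
INSERT INTO bar VALUES (42);         -- succeeds
```

Note that INSERT INTO bar VALUES ('42') would still succeed, because SQLite converts the text to an integer via the column's affinity before the CHECK runs; only values that cannot be converted are rejected.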
HyperSQL is another possibility. It supports only JDBC, and might run without a server component. (The docs weren't immediately clear about that.) I think it would require a lot more work to switch to this than to SQL Server CE.
See also H2 and Firebird.

How can I maintain a consistent DB schema across 18 databases (SQL Server)?

We have 18 databases that should have identical schemas, but don't. In certain scenarios, a table was added to one, but not the rest. Or, certain stored procedures were required in a handful of databases, but not the others. Or, our DBA forgot to run a script to add views on all of the databases.
What is the best way to keep database schemas in sync?
For legacy fixes/cleanup, there are tools, like SQLCompare, that can generate scripts to sync databases.
For .NET shops running SQL Server, there is also the Visual Studio Database Edition, which can create change scripts for schema changes that can be checked into source control, and automatically built using your CI/build process.
SQL Compare by Red Gate is a great tool for this.
SQLCompare is the best tool that I have used for finding differences between databases and getting them synced.
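If the databases sit on the same instance, you can also get a rough first pass at spotting drift with the catalog views before reaching for a tool; a sketch (the database names are placeholders, and matching on name/type only is deliberately crude):

```sql
-- Objects present in the reference database but missing from another:
SELECT o.name, o.type_desc
FROM ReferenceDB.sys.objects AS o
WHERE o.is_ms_shipped = 0
  AND NOT EXISTS (
        SELECT 1
        FROM OtherDB.sys.objects AS b
        WHERE b.name = o.name AND b.type = o.type
      );
```

This finds missing tables, views and procedures, but unlike SQLCompare it will not notice column-level differences or changed procedure bodies; for that you would need to compare sys.columns and sys.sql_modules as well.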
To keep the databases synced up, you need to have several things in place:
1) You need policies about who can make changes to production. Generally this should be only the DBA (the DBA team for larger orgs) and 1 or 2 backups. The backups should only make changes when the DBA is out, or in an emergency. The backups should NOT be deploying on a regular basis. Set database rights according to this policy.
2) A process and tools to manage deployment requests. Ideally you will have a development environment, a test environment, and a production environment. Developers should do initial development in the dev environment, and have changes pushed to test and production as appropriate. You will need some way of letting the DBA know when to push changes. I would NOT recommend a process where you holler to the next cube. Large orgs may have a change control committee and changes only get made once a month. Smaller companies may just have the developer request testing, and after testing is passed a request for deployment to production. One smaller company I worked for used Problem Tracker for these requests.
Use whatever works in your situation and budget, just have a process, and have tools that work for that process.
3) You said that sometimes objects only need to go to a handful of databases. With only 18 databases, probably on one server, I would recommend making each database match objects exactly. Only 5 DBs need usp_DoSomething? So what? Put it in every database. This will be much easier to manage. We did it this way on a 6-server system with around 250-300 DBs. There were exceptions, but they were grouped: databases on server C got this extra set of objects, databases on server L got this other set.
4) You said that sometimes the DBA forgets to deploy change scripts to all the DBs. This tells me that s/he needs tools for deploying changes. S/he is probably taking a SQL script, opening it in Query Analyzer or Management Studio (or whatever you use) and manually going to each database and executing the SQL. This is not a good long-term (or short-term) solution. Red Gate (makers of SQLCompare above) have many great tools. MultiScript looks like it may work for deployment purposes. I worked with a DBA who wrote his own tool in SQL Server 2000 using osql. It would take an SQL file and execute it on each database on the server. He had to execute it on each server, but it beat executing on each DB. I also helped write a VB.NET tool that would do the same thing, except it would also go through a list of servers, so it only had to be executed once.
5) Source Control. My current team doesn't use source control, and I don't have enough time to tell you how many problems this causes. If you don't have some kind of source control system, get one.
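The run-on-every-database tool described in point 4 can also be approximated without writing any code, using the undocumented but widely used sp_MSforeachdb procedure; a sketch (the change script inside is a made-up example, and you should filter out the system databases as shown):

```sql
-- Run one change script against every user database on the instance
EXEC sp_MSforeachdb N'USE [?];
IF DB_NAME() NOT IN (''master'', ''model'', ''msdb'', ''tempdb'')
BEGIN
    -- the change script goes here, for example:
    EXEC (''CREATE TABLE dbo.DeployDemo (id int)'');
END';
```

Because sp_MSforeachdb is undocumented, some shops prefer to generate the database list from sys.databases and loop over it in their own deployment script instead; either way, the point is that the DBA runs one command, not eighteen.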
I haven't got enough reputation to comment on the above answer, but the Pro version of SQL Compare has a scriptable API. Given that you have to replicate stuff to all of these databases, you could use this to make an automated job either to generate the change scripts or to validate that the databases are all in sync. It's also not much more expensive than the Standard version.
Aside from using database comparison tools, with 18 databases you should have a DBA, so enforce a policy that only the DBA can change tables at the database level by restricting access to CREATE and ALTER to the DBA only. On both your test and live databases. The dev database shouldn't have this, of course! Make the developers who have been creating or altering the schemas willy-nilly go via the DBA.
Create a single source-controlled DDL/SQL script for each release and only use it to update the databases. The diff tools can be useful but mainly for checking that you haven't made a mistake and getting out of trouble when the policies fail. Combine the DDL, SQL, and stored procedure scripts into a single script so that it's not easy to "forget" to run one of the scripts.
We have a tool called DB Schema Difftective that can compare and sync database schemas. With our other tool, DB MultiRun, you can easily deploy the generated (sync) scripts to multiple DB servers (project-based).
I realize this post is old, but TurnKey is correct. If you are a developer working in a team environment, the best way to maintain a database schema for a large application is to make updates to a master schema in whatever source control you use. Simply write your own scripting class and your database will be perfect every time.