Refresh data in In-Memory Mode in SSAS

I have created a tabular project in "In Memory" mode. I want to refresh the data on a daily basis. Is there any way to schedule a job which will refresh the data daily?

Your options are rather limited at the moment.
You can't automate it directly from the portal (you can vote here for such functionality).
You can trigger a refresh from an Azure Function:
#r "D:\home\site\wwwroot\yourfunccatalog\bin\Microsoft.AnalysisServices.DLL"
#r "D:\home\site\wwwroot\yourfunccatalog\bin\Microsoft.AnalysisServices.Core.DLL"
#r "D:\home\site\wwwroot\yourfunccatalog\bin\Microsoft.AnalysisServices.Tabular.DLL"
using System;
public static void Run(string input, TraceWriter log)
{
var server = new Microsoft.AnalysisServices.Tabular.Server();
server.Connect("Provider=MSOLAP;Data Source=asazure://[...];User ID=[...];Password=[...];Catalog=[...];Persist Security Info=True; Impersonation Level=Impersonate;");
var model = server.Databases[0].Model;
model.RequestRefresh(Microsoft.AnalysisServices.Tabular.RefreshType.Full);
model.SaveChanges();
server.Disconnect();
}
You will have to upload the DLLs before you run it (in my case, from C:\Program Files (x86)\Microsoft SQL Server\130\SDK\Assemblies).
Azure Automation might be an option if PowerShell is your thing; use the Invoke-ASCmd cmdlet.
The XMLA script should look like this (I haven't tested it!):
<Statement xmlns="urn:schemas-microsoft-com:xml-analysis">
  {
    "refresh": {
      "type": "full",
      "objects": [
        {
          "database": "Your database name here"
        }
      ]
    }
  }
</Statement>
You can also use the C# Tabular Object Model (TOM) from your own scheduled process.
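A minimal console-app sketch of that approach (same [...] placeholders as the Azure Function above; you could schedule the compiled exe with Windows Task Scheduler or a SQL Agent job):

using Microsoft.AnalysisServices.Tabular; // reference the Tabular Object Model assemblies

class RefreshJob
{
    static void Main()
    {
        var server = new Server();
        // Fill in your own server, credentials and catalog
        server.Connect("Provider=MSOLAP;Data Source=asazure://[...];User ID=[...];Password=[...];Catalog=[...];");

        var model = server.Databases[0].Model;
        model.RequestRefresh(RefreshType.Full); // queue a full refresh of the model
        model.SaveChanges();                    // send the refresh to the server
        server.Disconnect();
    }
}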


How to start on ServiceStack?

Can you help me figure out how I should start on this?
I'm new to APIs, and I'm currently working on ASP.NET Core 3.1 MVC paired with Microsoft SQL Server. I have a requirement to use an API (ServiceStack) for a certain method.
My questions are:
How or where do I start from my existing project solution?
If I use the API, should it be calling SQL as well? (I suppose the data will stay in the DB.)
With regard to the first question: they gave me a link where I can see this.
Where should I start? I'm just so confused.
I've looked on YouTube, but there's no case similar to mine; they all use in-memory databases.
Suggestions and advice are welcome!
Go through ServiceStack's Getting Started Section starting with Create your first Web Service.
Configure OrmLite in your AppHost
To configure OrmLite, start with the OrmLite Installation docs, which tell you which package to download, whilst the OrmLite Getting Started docs list all the available SQL Server dialects you'd use to configure the OrmLiteConnectionFactory in your IOC.
E.g. for SQL Server 2012:
public class AppHost : AppHostBase
{
    public AppHost() : base("MyApp", typeof(MyServices).Assembly) { }

    // Configure your ServiceStack AppHost and App dependencies
    public override void Configure(Container container)
    {
        container.AddSingleton<IDbConnectionFactory>(
            new OrmLiteConnectionFactory(connectionString,
                SqlServer2012Dialect.Provider));
    }
}
Using OrmLite in Services
Then inside your ServiceStack Services you can access your ADO.NET DB connection via base.Db, which you can use with OrmLite's extension methods, e.g.:
public class MyServices : Service
{
    public object Any(GetAllItems request) => new GetAllItemsResponse {
        Results = Db.Select<Item>()
    };
}
Check out the OrmLite APIs docs for the different APIs to select, insert, update & delete data.
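For instance, a minimal hedged sketch of a hand-written insert Service (CreateItem, Item and ItemWriteServices are illustrative names, not from the docs):

[Route("/items", "POST")]
public class CreateItem : IReturn<Item>
{
    public string Name { get; set; }
}

public class ItemWriteServices : Service
{
    public object Post(CreateItem request)
    {
        var item = new Item { Name = request.Name };
        item.Id = (int)Db.Insert(item, selectIdentity: true); // insert row & return its identity
        return item;
    }
}

Db.Update(item) and Db.DeleteById<Item>(request.Id) follow the same pattern for updates and deletes.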
Creating effortless RDBMS APIs using AutoQuery
As you're new I'd highly recommend using AutoQuery RDBMS since it lets you create RDBMS APIs with just Request DTOs.
You can enable it by adding the AutoQueryFeature plugin, available in the "ServiceStack.Server" NuGet package:
public override void Configure(Container container)
{
    Plugins.Add(new AutoQueryFeature { MaxLimit = 100 });
}
Then you can create an AutoQuery API for your Item table with just:
[Route("/items")]
public class QueryItems : QueryDb<Item> {}
This will now let you query each Item column using any of AutoQuery's implicit conventions, e.g. by exact match:
/items?Id=1
Or by any of the query properties:
/items?NameStartsWith=foo
Creating Typed Request DTO
Once you know which Query APIs your client Apps need, I'd recommend formalizing them by adding them as strongly typed properties in your Request DTO, e.g.:
[Route("/items")]
public class QueryItems : QueryDb<Item>
{
public int? Id { get; set; }
public string NameStartsWith { get; set; }
}
Calling from Service Clients
This will enable an end-to-end typed API using any of ServiceStack's Service Clients, e.g.:
var client = new JsonServiceClient(BaseUrl);
var response = client.Get(new QueryItems { NameStartsWith = "foo" });
response.PrintDump(); // quickly view results in Console
There are also AutoQuery CRUD APIs that let you create APIs that modify your RDBMS tables using just Request DTOs.
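As a rough sketch (reusing the illustrative CreateItem DTO from earlier), the hand-written insert Service above could collapse into a single Request DTO, with AutoQuery CRUD generating the implementation:

[Route("/items", "POST")]
public class CreateItem : ICreateDb<Item>, IReturn<IdResponse>
{
    public string Name { get; set; }
}

POSTing to /items then inserts the row and returns the new Id in an IdResponse.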

Application Insights not logging SQL Dependencies queries

I'm using an App Service hosted in Azure, using ASP.NET Core & .NET 5. I turned on SQL dependency tracking using the settings below in appSettings.config, but I see SQL dependencies logged without the SQL command text. Are there any other settings to enable to see SQL commands in the log?
"ApplicationInsights": {
"InstrumentationKey": "my guid key",
"EnableDependencyTracking": true,
"DependencyTrackingOptions": {
"EnableSqlCommandTextInstrumentation": true
},
"sampling": {
"isEnabled": true,
"maxTelemetryItemsPerSecond": 5
}
},
Your settings in appSettings.config are for Azure Functions, not ASP.NET Core applications.
For ASP.NET Core applications, it is now required to opt in to SQL command text collection by using:
services.ConfigureTelemetryModule<DependencyTrackingTelemetryModule>((module, o) => { module.EnableSqlCommandTextInstrumentation = true; });
For more details, you can refer to the official doc:
Advanced SQL tracking to get full SQL Query
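Putting it together, a minimal ConfigureServices sketch (assuming the Microsoft.ApplicationInsights.AspNetCore package, which provides both AddApplicationInsightsTelemetry and the ConfigureTelemetryModule extension):

using Microsoft.ApplicationInsights.DependencyCollector;
using Microsoft.Extensions.DependencyInjection;

public class Startup
{
    public void ConfigureServices(IServiceCollection services)
    {
        services.AddApplicationInsightsTelemetry(); // picks up the instrumentation key from config

        // Opt in to capturing the full SQL command text on dependency telemetry
        services.ConfigureTelemetryModule<DependencyTrackingTelemetryModule>((module, o) =>
            module.EnableSqlCommandTextInstrumentation = true);
    }
}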

How to seed Hangfire database in asp.net core 2.0

I'm trying to integrate "Hangfire" into my existing ASP.NET Core application. Looking into the documentation, it looks like the Hangfire DB needs to be created if we want to use the SQL Server storage option. But it would be really annoying to create a DB every time we spin up a new server.
Is there a way I could seed the DB with the SQL storage option enabled, rather than from the Startup class like this?
services.AddHangfire(options =>
    options.UseSqlServerStorage(Configuration["HangfireConnection"]));
A good solution could be to isolate this concern into some T-SQL scripts and run them on app startup or before running your tests (based on your scenario), but a quick solution that I have used before is something like this (based on this post):
public class HangfireContext : DbContext
{
    public HangfireContext(DbContextOptions options)
        : base(options)
    {
        // Creates the database if it doesn't exist yet;
        // Hangfire will then create its own schema on first use
        Database.EnsureCreated();
    }
}

public class Startup
{
    public void ConfigureServices(IServiceCollection services)
    {
        var connectionString = Configuration.GetConnectionString("HangfireConnectionString");

        var optionsBuilder = new DbContextOptionsBuilder<HangfireContext>();
        optionsBuilder.UseSqlServer(connectionString);

        // Instantiating the context runs EnsureCreated() in the constructor above
        var hangfireDbContext = new HangfireContext(optionsBuilder.Options);

        services.AddHangfire(config =>
            config.UseSqlServerStorage(connectionString));
    }
}
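If you'd rather not create an EF Core context just for this, the T-SQL-script idea mentioned at the top of this answer works with a plain SqlConnection. A hedged sketch (HangfireDbSeeder is a hypothetical helper; the master connection string and database name are illustrative and must come from configuration you control, not user input):

using System.Data.SqlClient;

public static class HangfireDbSeeder
{
    // Call once at startup, before services.AddHangfire(...)
    public static void EnsureDatabase(string masterConnectionString, string dbName)
    {
        using (var connection = new SqlConnection(masterConnectionString))
        {
            connection.Open();
            using (var command = connection.CreateCommand())
            {
                // Create the database only if it doesn't already exist
                command.CommandText =
                    $"IF DB_ID(N'{dbName}') IS NULL CREATE DATABASE [{dbName}]";
                command.ExecuteNonQuery();
            }
        }
    }
}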

TPL Task in WCF service fails to use correct IIS security Credentials (SQL Connection)

I have a WCF service method that calls a SQL stored proc. I'm developing using IIS 5 (can't do much about that; IIS 6/7 not available).
To get some gains, I'm making a number of async calls to this stored proc by putting the call into a C# TPL Task.
When run as a Task, I'm getting a SQL exception:
"Login failed. The login is from an untrusted domain and cannot be used with Windows authentication"
However, if I run the exact same process without using a Task, I have no problems with the SQL connection.
It would appear to me that the credentials for the IIS virtual folder (WCF) are not being delegated to the Task. Any ideas how I can specify credentials for the TPL Task thread, i.e. to use the same as the parent, etc.?
I am using Windows Authentication (SSPI) and impersonation to be able to connect to the separate SQL box.
Your help is appreciated.
You have two choices.
1) Opt your entire application into always flowing the identity using:
<runtime>
  <alwaysFlowImpersonationPolicy enabled="true"/>
</runtime>
This has a side effect of overhead and the danger of accidentally executing some unintended code with the privileges of the currently calling user rather than the application identity. I would personally avoid this and go with #2, where you explicitly opt in.
2) Capture the WindowsIdentity before setting up your TPL tasks and explicitly impersonate where you need to make the calls using Impersonate + WindowsImpersonationContext:
public void SomeWCFOperation()
{
    WindowsIdentity currentIdentity = WindowsIdentity.GetCurrent();

    Task.Factory.StartNew(() =>
    {
        // some unprivileged code here

        using (WindowsImpersonationContext impersonationContext = currentIdentity.Impersonate())
        {
            // this code will execute with the privileges of the caller
        }

        // some more unprivileged code here
    });
}
As another workaround, you can create extensions to the TPL as follows:
public static class TaskFactoryExtensions
{
    public static Task StartNewImpersonated(this TaskFactory taskFactory, Action action)
    {
        var identity = WindowsIdentity.GetCurrent();
        return taskFactory.StartNew(() =>
        {
            using (identity.Impersonate())
            {
                action();
            }
        });
    }

    public static Task<TResult> StartNewImpersonated<TResult>(this TaskFactory taskFactory, Func<TResult> function)
    {
        var identity = WindowsIdentity.GetCurrent();
        return taskFactory.StartNew<TResult>(() =>
        {
            using (identity.Impersonate())
            {
                return function();
            }
        });
    }
}
You would then call these new methods in place of the standard StartNew methods.
The downside to this is that there are a lot of methods to override.
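Usage is then a drop-in replacement for StartNew (CallStoredProc being a placeholder for your data-access call):

// Executes the delegate under the calling user's Windows identity
Task task = Task.Factory.StartNewImpersonated(() => CallStoredProc());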

Could not load middleware layer 'com.sap.mw.jco.rfc.MiddlewareRFC'

I'm using SAP JCo to connect to an SAP database, with the front end being Java (JSF). When I connect to SAP with:
try {
    mConnection = JCO.createClient("400",      // SAP client
                                   "c3026902", // user ID
                                   "********", // password
                                   "EN",       // language
                                   "iwdf5020", // host name
                                   "00");      // system number
    mConnection.connect();
}
catch (Exception ex) {
    ex.printStackTrace();
    System.exit(1);
}
The problem I'm facing is that when I run the application for the first time, data is displayed, but when I re-run it, it says "Could not load middleware layer 'com.sap.mw.jco.rfc.MiddlewareRFC'".
Can anyone help me resolve the issue?
This sounds like the API cannot load the native driver files.
The SAP Java Connector consists of a native runtime part that does the actual communication, and a Java API that wraps this functionality.
The Java API is inside sapjco.jar, and the native drivers are (e.g. on Windows) inside librfc32.dll and sapjcorfc.dll.
Place these DLLs into your system path (e.g. on Windows: C:\WINDOWS\system32) and it should run.
Cheers
Sebastian
Are your DLLs located in the Windows system32 folder? If so, are you perhaps using the wrong architecture (an x64 DLL on 32-bit, or vice versa)?
Also, are the DLLs the same version as the Java API? If you have SAP GUI installed, there could be older DLLs around.
Defining an SAP connection:
For version 3.0 of the sapjco library there is plenty of useful information. To create a connection, follow the instructions at:
http://www.browseye.com/linkShare.html?url=http://help.sap.com/saphelp_nwpi711/helpdata/en/46/fb807cc7b46c30e10000000a1553f7/content.htm?bwsCriterion=%22Setting%20Up%20Connection%22&bwsMatch=1&bwsCriterion=%22Setting%20Up%20Connection%22&bwsMatch=1
There are a few things that you should take into account:
Place the dll file in the same place as the jar.
The dll must be the right version for your operating system and architecture, otherwise you will get a native library error.
Example of code to create a connection to the server.
public class StepByStepClient
{
    static String DESTINATION_NAME1 = "ABAP_AS_WITHOUT_POOL";
    static String DESTINATION_NAME2 = "ABAP_AS_WITH_POOL";

    static
    {
        Properties connectProperties = new Properties();
        connectProperties.setProperty(DestinationDataProvider.JCO_ASHOST, "ls4065");
        connectProperties.setProperty(DestinationDataProvider.JCO_SYSNR, "85");
        connectProperties.setProperty(DestinationDataProvider.JCO_CLIENT, "800");
        connectProperties.setProperty(DestinationDataProvider.JCO_USER, "homofarber");
        connectProperties.setProperty(DestinationDataProvider.JCO_PASSWD, "laska");
        connectProperties.setProperty(DestinationDataProvider.JCO_LANG, "en");
        createDestinationDataFile(DESTINATION_NAME1, connectProperties);

        connectProperties.setProperty(DestinationDataProvider.JCO_POOL_CAPACITY, "3");
        connectProperties.setProperty(DestinationDataProvider.JCO_PEAK_LIMIT, "10");
        createDestinationDataFile(DESTINATION_NAME2, connectProperties);
    }

    static void createDestinationDataFile(String destinationName, Properties connectProperties)
    {
        File destCfg = new File(destinationName + ".jcoDestination");
        try
        {
            FileOutputStream fos = new FileOutputStream(destCfg, false);
            connectProperties.store(fos, "for tests only !");
            fos.close();
        }
        catch (Exception e)
        {
            throw new RuntimeException("Unable to create the destination files", e);
        }
    }

    public static void step1Connect() throws JCoException
    {
        JCoDestination destination = JCoDestinationManager.getDestination(DESTINATION_NAME1);
        System.out.println("Attributes:");
        System.out.println(destination.getAttributes());
        System.out.println();
    }
}
In SAPJco 3.0, connections are built from the info contained in a "Destination".
The documentation example uses a properties file to save the "Destination". However, this is an insecure way to keep connection info, as indicated in the highlighted paragraph of the documentation, which you can see at the next link.
http://help.sap.com/saphelp_nwpi711/helpdata/en/48/5fb9f9b523501ee10000000a421937/content.htm?bwsCriterion=%22In%20practice%20you%20should%20avoid%20this%20for%20security%20reasons.%22&bwsMatch=1
You can keep connection info in a database or any other storage system if you create a custom "DestinationDataProvider". Among the examples provided with the SAPJco library there is one showing how to create a custom DestinationDataProvider.