My Microsoft Dynamics NAV 2016 application server creates lots of dump files (*.dmp, *.dw, and *.txt) while the IIS Worker Process consumes all available memory - iis-6

The IIS Worker Process consumes all available memory, and many dump files are created in "C:\Users\Microsoft Dynamics NAV 2016 Web Client Application Pool\AppData\Local\Temp":
*.txt
*.dw - with contents like the following:
Version=131072
General_AppName=Microsoft Dynamics NAV
FilesToKeep=C:\Users\MICROS~1\AppData\Local\Temp\36144.940.637829576693681644.txt
FilesToDelete=C:\Users\MICROS~1\AppData\Local\Temp\36144.940.637829576693681644.dmp
EventType=DynamicsNavServiceException
P1=Microsoft.Dynamics.Nav.Client.WebClient
P2=9.0.43402.0
P3=562a359b
P4=Microsoft.Dynamics.Framework.UI.WebBase.dll
P5=9.0.43402.0
P6=562a357f
P7=282
P8=0
P9=System.NullReferenceException
UI LCID=1033
UIFlags=1
ReportingFlags=2048

Related

How does the Windows Task Manager group processes under an application?

As titled, I am wondering how the Windows Task Manager groups processes under one application name. Please see the picture below for what I want to do.
I have checked out this post, and I have two questions:
How do I get one application process to house the rest of the processes (including background processes) in the Task Manager, so that I can monitor the performance of the application as a whole?
How do I rename that root application and change its icon? My application spawns some Python processes in the background, and I would like them to appear as "company product" instead of "python" in the Task Manager.
Any guidelines will be appreciated.
Some more background:
In our application, we spawn many processes (Python, Postgres, Redis, etc.)
Currently, we bundle the entire application as a Windows application via Inno Setup
The actual start-up process is:
Launch a python script (__main__.py)
Create the necessary UI components
Create background processes like Redis, Postgres, etc.
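For the grouping part, one technique worth trying (a hedged sketch, not a definitive answer) is to tag every process you spawn with the same explicit AppUserModelID. Task Manager's app grouping is driven largely by window ownership and parent/child process relationships, but the AppUserModelID controls taskbar grouping and can be set from Python via ctypes. The ID string "CompanyName.ProductName" below is a placeholder, not anything from your setup:

```python
import ctypes
import sys

def set_app_user_model_id(app_id: str) -> bool:
    """Tag the current process with an explicit AppUserModelID.
    Processes sharing the same ID are grouped together on the Windows
    taskbar. Returns False where the API is unavailable (non-Windows)."""
    if sys.platform != "win32":
        return False
    # SetCurrentProcessExplicitAppUserModelID lives in shell32 (Win 7+).
    hr = ctypes.windll.shell32.SetCurrentProcessExplicitAppUserModelID(
        ctypes.c_wchar_p(app_id)
    )
    return hr == 0  # S_OK

# "CompanyName.ProductName" is a placeholder ID; call this early in
# __main__.py and in each spawned helper, before any window is created.
set_app_user_model_id("CompanyName.ProductName")
```

As for the displayed name and icon: Task Manager takes those from the executable's version resources, so renaming "python" generally means shipping a renamed copy of the interpreter (or a small launcher .exe carrying your own icon and file-description resource) rather than anything you can set at runtime.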

Nintex / SharePoint upgrade

My team is embarking on moving from SharePoint 2010 to SharePoint 2016 with Nintex. They want to move content on an individual basis.
However, we also need to move running workflows and keep these intact.
What's the process of moving Nintex workflows and lists from SP2010 to a SharePoint 2016 environment?
We need to ensure the workflows/lists retain the correct status.
Thanks
You cannot move running workflows in SharePoint. The best suggestion is to pause or end the workflow, save the status and then kick them off after you move them. I recommend trying to complete whatever process is running before moving that content or workflow. This means scheduling the migration around business processes or informing your users that SharePoint will be down for a period of time. You can also copy the workflow and list over to the new site and have users start using that one while you wind down the old site. A tool like Sharegate is good for this type of stuff.

MS Dynamics 365 Business Process Flow Visual Designer Output

I have created a "project"-entity Business Process Flow in the MS Dynamics 365 Visual Process Designer. How do I now output it into a live project interface that users can interact with? I get the sense this can be done either in Sales -> Opportunities or Project Service Automation -> New Project, but none of the areas I see mention my active business process flow.
Other Microsoft resources (examples linked below) describe the Process creation steps fully but skip the "export" step and immediately drop into the live project interface.
Research links:
A: https://technet.microsoft.com/library/mt826697.aspx
B: https://technet.microsoft.com/library/dn531067.aspx
C: https://us.hitachi-solutions.com/blog/dynamics-365-roadmap-a-complete-guide-to-dynamics-365-business-process-flows/
D: https://learn.microsoft.com/en-us/dynamics365/customer-engagement/customize/create-business-process-flow
Partial answer to my own question ... after I deactivated all the other default "project"-entity processes (specifically the standard "Project Phases" process), Dynamics Project Service automatically chose my active custom process when I created a new project.
It seems odd that only one active process per entity type is allowed, and that the user is given no choice to start from other templates.

Getting Search Server to ignore sharepoint document data, and speed up crawl times

Background:
I have a SharePoint Foundation 2010 installation that is being used to store scanned images of paper documents, making an electronic version of the paper file folders we keep for each of our company's clients. All of the documents are stored as PDF files.
The configuration includes a web server housing SharePoint and the Search Server 2010 Express service, as well as a separate database server housing the content data and the search crawl store. Both the SharePoint/Search box and the SQL box are VMware VMs running on shared hosts (including a shared SAN) with our other production servers.
Each file added to SharePoint must be added through a custom interface, including metadata tags for client information (a site content type with a set of site columns defines this extra metadata). We then expose this client-identifying data to the search server by setting Managed Properties, so we can run queries against the search web service specifying WHERE CustomClientID = X.
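For illustration, that property-restricted query can be sketched as a QueryPacket for the SharePoint 2010 Query web service (search.asmx). This is an assumption-laden sketch: only the managed property name CustomClientID comes from the setup above; the QueryText type="MSSQLFT" selects the SharePoint Search SQL dialect so the WHERE clause runs against the managed property rather than full-text document content:

```python
def build_query_packet(client_id: str) -> str:
    """Build a QueryPacket for the SharePoint 2010 search web service
    (search.asmx). type="MSSQLFT" selects the Search SQL dialect, so the
    WHERE clause filters on the managed property CustomClientID instead
    of the crawled document body."""
    sql = (
        "SELECT Title, Path FROM Scope() "
        f"WHERE CustomClientID = '{client_id}'"
    )
    return (
        '<QueryPacket xmlns="urn:Microsoft.Search.Query">'
        "<Query><Context>"
        f'<QueryText type="MSSQLFT">{sql}</QueryText>'
        "</Context></Query></QueryPacket>"
    )
```

The packet would be POSTed to the site's /_vti_bin/search.asmx endpoint; how to do that is orthogonal to the crawl-time question below.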
Our data currently resides in two large document libraries, one for each arm of the company.
After a few years of operation our server now has some 250,000 documents and we are having issues with full crawls (running weekly off hours) sometimes crashing part way through, and our incrementals (running every 5 min during work hours) take 7-8 minutes to pick up 2-3 new files.
Question:
I was wondering if there is a way to get the search server crawler to pick up only the metadata we are supplying and ignore the document contents entirely, which I assume would speed up the crawl process by orders of magnitude. I believe this feature is described as full-text search, but I have not been able to find anything that explains whether it can be turned off.
If not, is there an alternative option for speeding up crawl times that anyone would advise?

How to force process isolation for an out of process COM server?

I'm writing managed code that has to interact with a vendor's COM automation server that runs out of process. I have found that this server becomes unstable if more than one client connects to it. For example, if I have managed code in process A and managed code in process B both connecting to this COM server, a single process will be spun up for the COM server, and its behavior is less than reliable. I'm hoping there's a way to force a separate process for each client-server connection. Ideally, I'd like to end up with:
Managed Process A talking to COM Server in process C1
Managed Process B talking to COM Server in process C2
One thought that came to mind was that if I ran process A and process B under different security identities, that might cause the COM infrastructure to create separate server processes. I'd rather not go down that road, however. Managed Process A and Managed Process B are actually Windows Services, and I'm running them under the Local System identity (because I need them to be able to interact with the desktop, and you can't check the "Interact with Desktop" box in the services applet for services that don't run as Local System). The reason I need to interact with the desktop is that this COM server occasionally throws up a dialog box on the screen, and if the service itself cannot interact with the desktop then the COM server it spawns can't display the dialog (I believe it is displayed on a hidden WinStation).
Register the component in COM+; this puts an isolation layer around it.
Use: Control Panel -> Administrative Tools
or run DCOMCNFG from a command prompt.
Component Services -> Computers -> My Computer -> COM+ Applications: right-click, New Application, Next, Create an empty application, enter the app name "COM+ your.dll", Next, select Local Service, Next, Next, Next, Finish.
Expand the new item; under Components, right-click, New Component, Next, select Install new component, and select your component.
Open the component's properties, go to the Identity tab, and select System Account.
For errors in calls, check the Event Log afterwards.
It's been a while since I've done this, so my memory is hazy.
If you configure the OOP COM server as a DCOM server using the DCOM config tool, I believe you can specify the isolation level. I did this years ago with a non-threadsafe in-process DLL that needed to be accessed in a threadsafe fashion from IIS and it worked a charm.
Let me know if it works for you :)
Your best bet would be to get the vendor to fix the component. After all, if it won't handle multiple clients, there could be other bugs lurking. If you can't do this, there are some other things to try.
With in-process COM objects I've had occasion to manually load the DLL and access the interfaces directly without going through COM.
I haven't done this myself with out-of-process COM, but there are some things you could try. Ultimately the library is just a process sitting there receiving messages which invoke functions.
You might be able to manually start a new copy of the process for each client and then send it messages. You may run into some hiccups with this. For example, the process may check whether it's already running and refuse to start, or otherwise be unhappy.
If you have a known upper limit on the number of clients, another approach you could consider would be to make multiple copies of the original .exe file and then use binary patching (something similar to the detours library from Microsoft Research) to override the COM registration functions and register each copy as a separate COM object.
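From script code, the one-server-per-client idea can be sketched with pywin32 (an assumption-laden sketch: "Vendor.Application" is a made-up ProgID, and whether you actually get one process per client depends on whether the server registers its class object as REGCLS_SINGLEUSE rather than REGCLS_MULTIPLEUSE):

```python
def new_server_instance(prog_id: str):
    """Request a fresh instance of an out-of-process COM server.
    win32com.client.DispatchEx asks COM for a new instance rather than
    attaching to an already-running one (unlike GetActiveObject).
    Returns None where pywin32/COM is unavailable or the ProgID is
    not registered."""
    try:
        from win32com.client import DispatchEx  # pywin32, Windows only
        import pywintypes
    except ImportError:
        return None
    try:
        return DispatchEx(prog_id)
    except pywintypes.com_error:
        return None

# "Vendor.Application" is a hypothetical ProgID; with a REGCLS_SINGLEUSE
# server, each call would spin up its own server process.
server = new_server_instance("Vendor.Application")
```

If the vendor's server registers itself REGCLS_MULTIPLEUSE, even this will be routed to the single existing process, which brings you back to the binary-patching or per-client-copy approaches above.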