How to Identify Overlapping Resources in Multiple Projects - Microsoft Office - vba

My question is as follows:
Is there a way to make a master project plan incorporating multiple projects (not necessarily subprojects) in Microsoft Project? Our resources get pushed around when parts and items don't arrive, shipping dates move forward, and so on, so the resource plan has to be changed regularly. I want to be able to pull all current projects into a master project plan so that I can identify where project resources are overlapping - not necessarily by task, but by employee.
To try and explain a bit better:
Project 1:
Project Manager: John
Project Engineer: Jack. Task - Drawing
John assigns Jack to work on a task in Project 1.
Project 2:
Project Manager: Mark
Project Engineer: Jack. Task - Documentation
Jack wasn't supposed to be working on Project 2 for a further 2 weeks but the deadline has been moved forward and Mark has also assigned Jack to work on a task on his project.
I'd like to create a master project where I can pull in Project 1 and Project 2 and find a way for it to identify the resource overlap, regardless of the fact that Jack has been assigned to 2 different tasks, but more because he as an employee has been assigned to two projects at once.
Is this possible?
On a larger scale, I'll realistically need this to incorporate about 6 projects and about 20 staff members, so I can find all the overlaps.
I am aware that there is a way to split a person between 2 tasks by assigning a percentage of their time to each, i.e. 90% on task 1 and 10% on task 2, but that obviously isn't project-exclusive, and my aim is to identify the overlaps rather than create them on purpose for resource sharing.
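If the assignment data from each project file is exported to rows of (resource, project, task, start, finish) - for example via a small VBA loop or a CSV export - the overlap check itself is straightforward to automate. A minimal sketch in Python (all resource names, project names, and dates below are invented for illustration):

```python
from datetime import date
from collections import defaultdict

# Hypothetical assignment rows exported from each project file:
# (resource, project, task, start, finish)
assignments = [
    ("Jack", "Project 1", "Drawing",       date(2024, 3, 4),  date(2024, 3, 15)),
    ("Jack", "Project 2", "Documentation", date(2024, 3, 11), date(2024, 3, 22)),
    ("John", "Project 1", "Review",        date(2024, 3, 4),  date(2024, 3, 8)),
]

def find_overlaps(rows):
    """Report pairs of assignments in *different* projects whose
    date ranges overlap for the same resource."""
    by_resource = defaultdict(list)
    for row in rows:
        by_resource[row[0]].append(row)
    overlaps = []
    for resource, items in by_resource.items():
        for i in range(len(items)):
            for j in range(i + 1, len(items)):
                a, b = items[i], items[j]
                # Different projects, and the date windows intersect.
                if a[1] != b[1] and a[3] <= b[4] and b[3] <= a[4]:
                    overlaps.append((resource, a[1], b[1],
                                     max(a[3], b[3]), min(a[4], b[4])))
    return overlaps

for res, p1, p2, start, end in find_overlaps(assignments):
    print(f"{res}: {p1} overlaps {p2} from {start} to {end}")
```

With the sample data this flags Jack as double-booked between Project 1 and Project 2 for the week they intersect, regardless of which tasks he is on - which is the per-employee view the question asks for.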

You are basically looking for software to support your "resource leveling" process.
The ]project-open[ open-source PPM software is capable of importing MS-Project schedules. After the import, you can get Resource Management reports from the system that allow you to perform manual resource leveling.
Another interesting open-source tool is TaskJuggler. TJ actually does multi-project scheduling and automatic resource leveling. However, as far as I know, TJ does not currently include an MS Project integration.
Affiliation note: I'm a member of the ]project-open[ team.

Depending on how you want to approach this, you can have a single master project and make all of your other project files subprojects of that master file. You can work in this master file and it will save your updates back to the individual files. It is sort of like having Project Server on your desktop. You can create views for each project, or simply roll them up so each shows as a single summary task.

You can do this by using a shared resource file, but the best way now would be to get a Project Online subscription and run your projects there.

Related

Is this order correct for running the processors and options for azure-devops-migration-tools (using version 11.12.21)

Love the tool - some serious work has gone into it. If you are one of the maintainers reading this - let me start by saying thank you!
I am just reviewing the layering of steps for a cloud-based org to cloud-based org ADO migration (this is after working through a dry run migration).
Currently I have the below order.
I am unsure whether I have understood the documentation correctly. Does anyone know if step 9 should run directly after step 8, or if step 9 relies on step 10 being run first (changing the order to 8 -> 10 -> 9)?
An additional note for others trying this: in our setup we found that for step 3 we also had to exclude Shared Steps and Shared Parameters, as they are part of the Test scope. I will open an issue in the project about that.
I appreciate any advice that can be offered - along with if you see any other mistakes in our approach. Thank you!
| Step | Item | Notes | Dependents |
|------|------|-------|------------|
| 1 | Migrate the org-level inherited process | No errors | |
| 2 | Create project with inherited process, then attach Custom.ReflectedWorkItemId to all work item types | | Required by all items below |
| 3 | WorkItemMigrationConfig-configuration.json | Also migrates all areas and paths | Provides Areas and Paths; items below are heavily dependent on this |
| 4 | TfsTeamSettingsProcessorOptions | Teams, Areas, and Paths are the 3 core components of Projects in the metamodel | Items below dependent on this include Boards and Security group configuration |
| 5 | TfsSharedQueryProcessorOptions | | Boards is dependent on this |
| 6 | TestVariablesMigrationConfig | | TestConfigurationsMigrationConfig is dependent on this |
| 7 | TestConfigurationsMigrationConfig | Delta-WorkItemMigrationConfig-configuration must be run in sequence after this step; this has to be run before LinkMigrationConfig | TestPlansAndSuitsMigrationConfig is dependent on this |
| 8 | TestPlansAndSuitsMigrationConfig | Must be run after TestConfigurationsMigrationConfig | Test_scope_WorkItemMigrationConfig-configuration.json should be run after this |
| 9 | LinkMigrationConfig | This has not been run yet; the documentation suggests it should be run, and run in order | |
| 10 | Test_scope_WorkItemMigrationConfig-configuration.json | Current assumption: required for work item migration for Test Suite, Test Plan, Shared Steps, and Shared Parameters; ordering between this and LinkMigrationConfig is unclear | |
| 11 | Boards | | |
| 12 | Security Groups | Looks possible to achieve with code against the REST API for reproducibility | |
| 13 | Wikis | Takes 2 minutes using the ADO Web UI | |
| 14 | Repos | Looks possible to automate with code against the REST API for reproducibility | |
| 15 | AzureDevOpsPipelineProcessorOptions | The processor scope includes BuildPipelines, ReleasePipelines, TaskGroups, VariableGroups, ServiceConnections; in theory this could require Security Groups to be in place (though probably only for execution, not for the migration itself) | |
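For what it's worth, an ordering question like this can be sanity-checked mechanically: encode the dependencies the table asserts and test candidate sequences against them. A sketch in Python (the edges below are taken from the table's own claims, not from the tool's documentation, so they are assumptions):

```python
# Dependencies as stated in the table (step -> steps it requires).
# These encode the asker's assumptions, not the tool's official ordering.
requires = {
    3: {2}, 4: {3}, 5: {4},
    7: {6}, 8: {7},
    9: {7},   # LinkMigrationConfig must come after TestConfigurationsMigrationConfig
    10: {8},  # Test_scope work items run after TestPlansAndSuits
}

def violations(order, requires):
    """Return (step, missing_prereq) pairs where a step runs
    before one of its prerequisites."""
    position = {step: i for i, step in enumerate(order)}
    bad = []
    for step, prereqs in requires.items():
        for p in prereqs:
            if position[p] > position[step]:
                bad.append((step, p))
    return bad

# Both candidate orders satisfy the stated constraints, which is why
# the 9-vs-10 question cannot be settled from the table alone.
print(violations([1, 2, 3, 4, 5, 6, 7, 8, 9, 10], requires))  # []
print(violations([1, 2, 3, 4, 5, 6, 7, 8, 10, 9], requires))  # []
```

Since both 8 -> 9 -> 10 and 8 -> 10 -> 9 pass, the constraint that decides between them would have to come from the tool's documentation or maintainers, not from the dependencies listed here.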

Bamboo Deployment Projects - Can You Call Projects from Other Projects?

I have inherited a collection of Bamboo build plans and corresponding deployment projects. Here is a particular example of how I would like to leverage reuse. We have four deployment projects (say ProjA, ProjB, ProjC, ProjD) that can be run individually and independently. However, we also have a project where we deploy them together (call it ProjABCD). Currently, ProjABCD replicates the steps from the individual A, B, C, D projects and executes them consecutively (e.g., if each has 10 steps, ProjABCD has 40 steps).
Is there a way to have a super-project (ProjABCD) that simply "calls" the four individual projects ProjA, ProjB, ProjC, ProjD? Ideally, it would also be able to roll back to a baseline state if any project in the group fails to deploy properly.
You should be able to come up with a scheme using the triggers functionality. There should not be a need to have a 40 step combined deployment (that breaks a lot of the value the deployments provide). There are multiple ways to solve/simplify this problem. For example you may choose to have two different builds, and only one of them triggers the 4 deployments.

VBA code to run against Microsoft Project Templates

I have to write macros to be run in Microsoft Project.
Even after a lot of research, I am not able to perform the calculations to get a result.
The goal is 14 quality checks for an IT project schedule.
I am trying to implement the easiest one first: the Resource Check.
Resources Check identifies all the tasks that do not have resources (people or costs) assigned. A quality schedule has all resources assigned to tasks in the schedule.
Green Flag = < 5% of tasks meeting any of the above logic.
Red Flag = > 5% of tasks meeting any of the above logic.
How do I perform this?
Try googling Eversight for Microsoft Project - it's a free add-on for Project 2010 that allows you to set up your own quality check profiles, including your own RAG thresholds.
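The threshold arithmetic itself is simple, whatever tool performs it. A minimal sketch in Python (the task list is invented; porting this to Project VBA would mean looping over ActiveProject.Tasks and testing each non-summary task's Assignments.Count):

```python
# Sketch of the Resource Check RAG logic described in the question.
# Each task record notes whether it has any resource assigned.
tasks = [
    {"name": "Design", "assigned": True},
    {"name": "Build",  "assigned": True},
    {"name": "Test",   "assigned": False},
    {"name": "Deploy", "assigned": True},
]

def resource_check(tasks, threshold=0.05):
    """Count unassigned tasks and flag Green if fewer than the
    threshold fraction of tasks lack resources, otherwise Red."""
    unassigned = sum(1 for t in tasks if not t["assigned"])
    ratio = unassigned / len(tasks) if tasks else 0.0
    flag = "Green" if ratio < threshold else "Red"
    return unassigned, ratio, flag

print(resource_check(tasks))  # → (1, 0.25, 'Red')
```

With 1 of 4 tasks unassigned (25%), the schedule exceeds the 5% threshold and is flagged Red, matching the rule stated in the question.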

TFS2010 database size

We've been using TFS since around 2009 when we installed TFS2008. We upgraded to TFS2010 at some point and we've been using it for source control, work item management, builds etc.
Our TFSVersionControl.mdf file is 287,120,000 KB (273 GB). We ran some queries and found that our tbl_BuildInformationField table is massive: it has 1,358,430,452 rows, which take up 150,988,624 KB (143 GB). We have multiple active products with multiple active builds, more than one solution per build, and the solutions aren't free of warning messages.
My questions:
1. Is it possible to stop MSBuild from spamming the tbl_BuildInformationField table so much, i.e. only write errors and general build information, not all the warnings for every project?
2. Is there a way to purge or clean up old data from this table?
3. Is 273 GB for 4 years of TFS use an average size?
4. Is 143 GB for tbl_BuildInformationField a "normal" size?
The table holds the values and output of the build process. Note that the build retention policy doesn't actually delete the build object; like everything else in TFS, the object is marked as deleted, and only public visibility and the drop location are cleared.
If you have retained the same build definitions for a very long time (when a build definition is deleted, the related objects are removed as well), I would suggest querying for build information, including deleted builds, using the TFS API; the same API will also allow you to remove them for good. Deleting the build definitions themselves will probably not work and will fail with a timeout error.
You can consult the following:
http://blogs.msdn.com/b/adamroot/archive/2009/06/12/working-with-deleted-build-data-in-team-foundation-server-2010-beta-1.aspx

TFS Builds, Project Files: Orphaned references to files not being pushed are causing endless build errors

We are using TFS 2010 (Visual Studio) for our deployments and have client code projects (.csproj files) and database projects (.dbproj files). We understand that when our developers add files to our application, there is a corresponding reference to these files in the project file. If I push a changeset from Dev to QA that includes the project file, and the project file contains a reference to a newly added file that is not in the changeset, I will receive a build error.
Once we started pushing just changesets (as opposed to performing full builds) this quickly became our number one bottleneck in doing TFS builds. I would deploy the database project and there would be 20 errors. The only way I could correct them was to navigate down the entire solution explorer tree and exclude each orphaned reference individually. This has proved far too time consuming and on the advice of our lead programmer we have returned to doing full builds of QA and UAT.
We are in the early stages of this product, and therefore we will be adding many files for some time. We need a better solution for this problem. Neither the manual exclusions nor asking developers to not check in code until it is ready for qa will suffice for us. Has anybody out there had any experience with this problem and if so how did you deal with it? Thanks!
Jon
Pushing changesets to QA selectively is known as cherry-picking, and it causes the sorts of issues you are experiencing. This is not the recommended practice; instead, set up the QA build so that a successful build is part of check-in. That way, if part of a fix is missed (as it may be when a fix spans multiple changesets), the build will fail and the check-in cannot be completed.
Second, have the developers do the second check-in to QA, or merge the dev changesets to QA, and have the team lead coordinate changes to project files by turning on "notify changes made by others" or setting a policy for the dev team. Full builds should always be done, as partial builds do not always pick up the complete dependency graph.