How to calculate the energy consumed by data centers using CloudSim?

I want to calculate the energy that data centers consume using CloudSim. Is there a way to do this?
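One common approach, sketched below assuming CloudSim 3.x and its org.cloudbus.cloudsim.power package, is to build the data center from PowerHost objects that carry a PowerModel (for example PowerModelLinear), run the simulation through a PowerDatacenter, and read the accumulated energy with getPower() afterwards. All host, VM and cloudlet parameters in the sketch are illustrative.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Calendar;
import java.util.LinkedList;
import java.util.List;

import org.cloudbus.cloudsim.Cloudlet;
import org.cloudbus.cloudsim.CloudletSchedulerTimeShared;
import org.cloudbus.cloudsim.DatacenterBroker;
import org.cloudbus.cloudsim.DatacenterCharacteristics;
import org.cloudbus.cloudsim.Pe;
import org.cloudbus.cloudsim.Storage;
import org.cloudbus.cloudsim.UtilizationModelFull;
import org.cloudbus.cloudsim.Vm;
import org.cloudbus.cloudsim.VmAllocationPolicySimple;
import org.cloudbus.cloudsim.VmSchedulerTimeShared;
import org.cloudbus.cloudsim.core.CloudSim;
import org.cloudbus.cloudsim.power.PowerDatacenter;
import org.cloudbus.cloudsim.power.PowerHost;
import org.cloudbus.cloudsim.power.models.PowerModelLinear;
import org.cloudbus.cloudsim.provisioners.BwProvisionerSimple;
import org.cloudbus.cloudsim.provisioners.PeProvisionerSimple;
import org.cloudbus.cloudsim.provisioners.RamProvisionerSimple;

public class DatacenterEnergyExample {

    public static void main(String[] args) throws Exception {
        CloudSim.init(1, Calendar.getInstance(), false);

        // One host with one 1000-MIPS core and a linear power model:
        // 250 W at full load, 70% of that drawn even when idle (illustrative values).
        List<Pe> peList = new ArrayList<Pe>();
        peList.add(new Pe(0, new PeProvisionerSimple(1000)));
        PowerHost host = new PowerHost(0,
                new RamProvisionerSimple(16384), new BwProvisionerSimple(100000),
                1000000, peList, new VmSchedulerTimeShared(peList),
                new PowerModelLinear(250, 0.7));
        List<PowerHost> hostList = new ArrayList<PowerHost>();
        hostList.add(host);

        // PowerDatacenter samples host utilization every schedulingInterval seconds
        // and accumulates the energy drawn according to each host's PowerModel.
        DatacenterCharacteristics characteristics = new DatacenterCharacteristics(
                "x86", "Linux", "Xen", hostList, 10.0, 3.0, 0.05, 0.001, 0.0);
        PowerDatacenter datacenter = new PowerDatacenter("Datacenter_0", characteristics,
                new VmAllocationPolicySimple(hostList), new LinkedList<Storage>(), 300);
        datacenter.setDisableMigrations(true);

        // One VM running one CPU-bound cloudlet, just to generate some load.
        DatacenterBroker broker = new DatacenterBroker("Broker_0");
        Vm vm = new Vm(0, broker.getId(), 1000, 1, 1024, 1000, 10000, "Xen",
                new CloudletSchedulerTimeShared());
        Cloudlet cloudlet = new Cloudlet(0, 400000, 1, 300, 300,
                new UtilizationModelFull(), new UtilizationModelFull(), new UtilizationModelFull());
        cloudlet.setUserId(broker.getId());
        broker.submitVmList(Arrays.asList(vm));
        broker.submitCloudletList(Arrays.asList(cloudlet));

        CloudSim.startSimulation();
        CloudSim.stopSimulation();

        // getPower() returns the accumulated energy in watt-seconds; convert to kWh.
        System.out.printf("Energy consumed: %.4f kWh%n", datacenter.getPower() / (3600 * 1000));
    }
}
```

If you need a more realistic power curve than the linear model, CloudSim also ships SPEC-based power models in the same org.cloudbus.cloudsim.power.models package that you can swap in for PowerModelLinear.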

Related

Reservation Used Bytes parameter keeps changing

I created a dashboard in Cloud Monitoring to monitor BI Engine metrics. I have a chart that measures Reservation Used Bytes. The chart's values keep changing, ranging from 30 GB down to 430 MB according to the chart. Changing the time frame between days and weeks does not change the chart either. Why does the measurement swing over time from what appears to be high to low and back to high? And how can I see how many bytes have been utilized in total?
You are using a metric that is coupled to current usage, so it is expected to vary over time with increasing or decreasing values.
https://cloud.google.com/bigquery/docs/bi-engine-monitor#metrics
Reservation Used Bytes: Total capacity used in one Google Cloud project
If you need the total bytes, you need to switch to this metric:
Reservation Total Bytes: Total capacity allocated to one Google Cloud project
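If you prefer to read these metrics programmatically instead of from the dashboard, here is a minimal sketch using the Cloud Monitoring Java client. The metric type string in the filter is an assumption; confirm the exact names for "Reservation Used Bytes" and "Reservation Total Bytes" on the BI Engine monitoring page linked above.

```java
import com.google.cloud.monitoring.v3.MetricServiceClient;
import com.google.monitoring.v3.ListTimeSeriesRequest;
import com.google.monitoring.v3.Point;
import com.google.monitoring.v3.ProjectName;
import com.google.monitoring.v3.TimeInterval;
import com.google.monitoring.v3.TimeSeries;
import com.google.protobuf.util.Timestamps;

public class BiEngineReservationBytes {

    public static void main(String[] args) throws Exception {
        String projectId = "my-project";  // replace with your project ID

        // ASSUMPTION: the metric type string below is illustrative -- confirm the exact
        // name on the BI Engine monitoring page before relying on it.
        String filter = "metric.type=\"bigquery.googleapis.com/reservation/used_bytes\"";

        long nowMillis = System.currentTimeMillis();
        TimeInterval interval = TimeInterval.newBuilder()
                .setStartTime(Timestamps.fromMillis(nowMillis - 7L * 24 * 60 * 60 * 1000))  // last 7 days
                .setEndTime(Timestamps.fromMillis(nowMillis))
                .build();

        try (MetricServiceClient client = MetricServiceClient.create()) {
            ListTimeSeriesRequest request = ListTimeSeriesRequest.newBuilder()
                    .setName(ProjectName.of(projectId).toString())
                    .setFilter(filter)
                    .setInterval(interval)
                    .setView(ListTimeSeriesRequest.TimeSeriesView.FULL)
                    .build();
            for (TimeSeries series : client.listTimeSeries(request).iterateAll()) {
                for (Point point : series.getPointsList()) {
                    // The value type may be INT64 or DOUBLE depending on the metric descriptor.
                    System.out.println(point.getInterval().getEndTime().getSeconds()
                            + " -> " + point.getValue().getInt64Value() + " bytes");
                }
            }
        }
    }
}
```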

Get the cost of an individual pipeline execution in Data Factory

I'm looking into using Azure Data Factory V2 for integration imports and want to know if there's a way to track the cost of individual pipelines being run?
For example if I had 3 pipelines which represented 3 different integrations would there be a way to see the cost incurred from each?
Is there also a way to do this in near real time, so that during a month I could somehow put a budget on each integration/pipeline?
The Azure Data Factory bill only shows the total cost. We can't get the cost of each pipeline from Data Factory; we must calculate the price manually.
We can see the pipeline-level consumption under Monitor --> Pipeline runs --> Consumption:
The Azure documentation says that "The pipeline run consumption view shows you the amount consumed for each ADF meter for the specific pipeline run, but it does not show the actual price charged". We need to calculate the cost manually with the Pricing calculator.
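As a rough illustration of that manual step, the calculation boils down to multiplying each consumption meter by its unit price. The sketch below uses placeholder quantities and placeholder unit prices; take the real prices for your region and integration runtime from the Azure pricing calculator.

```java
public class AdfPipelineRunCostEstimate {

    public static void main(String[] args) {
        // Quantities as shown in Monitor --> Pipeline runs --> Consumption (example values).
        double activityRuns = 12;             // orchestration activity runs
        double dataMovementDiuHours = 0.5;    // copy activity DIU-hours
        double pipelineActivityHours = 0.25;  // execution hours of other pipeline activities

        // PLACEHOLDER unit prices -- look up the real ones for your region
        // in the Azure pricing calculator; they are not hard-coded anywhere in ADF.
        double pricePer1000ActivityRuns = 1.0;
        double pricePerDiuHour = 0.25;
        double pricePerPipelineActivityHour = 0.005;

        double estimatedCost = (activityRuns / 1000.0) * pricePer1000ActivityRuns
                + dataMovementDiuHours * pricePerDiuHour
                + pipelineActivityHours * pricePerPipelineActivityHour;

        System.out.printf("Estimated cost for this pipeline run: $%.4f%n", estimatedCost);
    }
}
```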
For your questions: Is there a way to see the cost incurred from each?
No, there isn't; you must calculate the cost manually.
Is there also a way to do this in near real time, so that during a month I could somehow put a budget on each integration/pipeline?
I'm afraid not.
Others have posted almost the same question; please see here: Azure Data Factory Pipeline Consumption Details

Google Dataflow instance and BigQuery cost considerations

I am planning to spin up a Dataflow instance on Google Cloud Platform to run some experiments. I want to get familiar with Apache Beam and experiment with pulling data from BigQuery, running some ETL jobs (in Python) and streaming jobs, and finally storing the results in BigQuery.
However, I am also concerned about sending my company's GCP bill through the roof. What are the main cost considerations, or any methods to estimate what the cost will be, so I don't get an earful from my boss?
Any help would be greatly appreciated, thanks!
You can use the pricing calculator to get an estimate of the price of the job.
One of the most important resources on the Dataflow side is CPU hours. To limit the CPU hours, you can cap the maximum number of workers using the maxNumWorkers option in your pipeline.
Here are more pipeline options that you can set when running your Dataflow job: https://cloud.google.com/dataflow/docs/guides/specifying-exec-params
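For example, capping the workers looks roughly like this with the Beam Java SDK (the Python SDK exposes the same cap as the --max_num_workers pipeline option); the cap value and machine type below are just illustrative:

```java
import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

public class CappedWorkersPipeline {

    public static void main(String[] args) {
        DataflowPipelineOptions options = PipelineOptionsFactory.fromArgs(args)
                .withValidation()
                .as(DataflowPipelineOptions.class);

        // Cap autoscaling so a runaway job cannot keep adding workers (and vCPU-hours).
        options.setMaxNumWorkers(5);
        // A smaller machine type also lowers the per-hour vCPU/memory charge (illustrative choice).
        options.setWorkerMachineType("n1-standard-1");

        Pipeline pipeline = Pipeline.create(options);
        // ... add your BigQuery read / transform / write steps here ...
        pipeline.run();
    }
}
```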
For BQ, you can do a similar estimate using the calculator.

Stream analytics small rules on high amount of device data

We have the following situation:
We have multiple devices sending data to an Event Hub (the interval is one second).
We have a lot of small Stream Analytics rules for alarm checks. The rules are applied to a small subset of the devices.
Example:
10,000 devices sending data every second.
Rules for roughly 10 devices.
Our problem:
Each Stream Analytics query processes all of the input data, although the job only has to process a small subset of it. Each query filters on device ID and discards most of the data. Thus we need a huge number of streaming units, which leads to high Stream Analytics costs.
Our first idea was to create an Event Hub for each query. However, each Event Hub requires at least one throughput unit, which also leads to high costs.
What is the best solution in our case?
One possible solution would be to use IoT Hub and create a dedicated endpoint with a specific route for the devices you want to monitor.
Have a look at this blog post to see if that will work for your particular scenario: https://azure.microsoft.com/en-us/blog/azure-iot-hub-message-routing-enhances-device-telemetry-and-optimizes-iot-infrastructure-resources/
Then, in Azure Stream Analytics, you can use this specific endpoint as the input.
Thanks,
JS (Azure Stream Analytics team)

Calculating all the values of a flow in AnyLogic

I am writing an economics project in AnyLogic. I want to sum all the money that flows between two stocks; in fact, I need to sum all the values that a flow takes during the simulation, until a specific condition is met. How can I do that?
Thanks
Well,
you need to be careful to understand how system dynamics works: it is a continuous process!
If you want to track your flows, the easiest option is to use a dataset object which tracks the flow at specific points in time.
For example, a flow "AdoptionRate" can be tracked by a dataset "AdoptersDS" every 0.1 minutes.
However, be aware that this tracks the flow at specific points in time. You can set up similar datasets for your stocks as well.
Alternatively, you could write a recurring event which stores the values at specific points in time into your built-in database.
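As a sketch of that event-based option: add a plain double variable (say cumulativeFlow) plus a cyclic event firing every 0.1 time units, and accumulate the flow in the event's action. The names AdoptionRate, AdoptersDS, cumulativeFlow and stopThreshold are illustrative, and the sum is only a discrete approximation of the continuous integral:

```java
// Action of a cyclic event (recurrence time = 0.1), a sketch with illustrative names.
AdoptersDS.add(time(), AdoptionRate);    // record the current flow value, as in the dataset approach
cumulativeFlow += AdoptionRate * 0.1;    // approximate the integral: current flow value * time step
if (cumulativeFlow >= stopThreshold) {   // the "specific condition" from the question
    finishSimulation();                  // stop the run once the condition is met
}
```

Alternatively, and more in the spirit of system dynamics, you can add an auxiliary stock whose only inflow is that flow; by definition the stock then holds the exact integral of the flow, i.e. the total money that has passed through it.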