Difference Between Variables And Parameters - gams-math

I am new to GAMS and am struggling to find useful tutorials online. Can someone answer the following for me:
What is the difference between parameters and variables? From our college session they appear to be the same, although I imagine I am missing something.
Our first program has the following code.
Parameters
el_supply Price elasticity of supply in the EU
el_dem Price elasticity of demand in the EU
int_supply Supply intercept in the EU
int_dem Demand intercept in the EU
tp Rate of technical progress
chg_dem Rate of change in demand
;
el_supply = 0.5;
el_dem = -0.1;
Variables
SUPPLY Supply of wheat in the EU (Mio t)
DEMAND Demand of wheat in the EU (Mio t)
NX Net exports of wheat in the EU (Mio t)
PRICE Wheat price in the EU (Euro per t)
;

In short, parameters hold data that you supply and that stays fixed during the solve; variables are the unknowns whose values the optimizer determines.
But this was actually asked before and answered in detail here: In GAMS, what is the difference between variables and parameters?
If you are looking for a basic tutorial, you can find one here: https://www.gams.com/latest/docs/UG_Tutorial.html
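To make the distinction concrete, here is a rough analogy in Python rather than GAMS (the elasticities are the ones assigned above; the intercept values and the constant-elasticity functional form are made up for illustration). The parameters are plain numbers fixed before solving; PRICE plays the role of a variable, because the solver, not the modeller, picks its value:

from scipy.optimize import brentq

# Parameters: data that is fixed before solving
el_supply, el_dem = 0.5, -0.1
int_supply, int_dem = 10.0, 100.0   # made-up intercepts

# Constant-elasticity supply and demand curves
def supply(price):
    return int_supply * price**el_supply

def demand(price):
    return int_dem * price**el_dem

# PRICE behaves like a GAMS variable: the solver searches for the value
# that makes excess demand zero.
price = brentq(lambda p: supply(p) - demand(p), 1.0, 1000.0)
print(price, supply(price))   # equilibrium price and quantity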

Related

I need some assistance creating a loop for demand and supply constraints in GAMS

I am a newbie at GAMS. Could you do me a favour, please?
I am trying to write two constraints for minimizing the cost of each factory (i) in each month (k):
Constraint 1: If the total crop (tonnes) from the selected farmers (j) cannot meet the factory's demand, the farmers still have to send everything they have to the factory. For example, if the factory needs 5 tonnes of crops but farmers 8, 13 and 25 together only have 3 tonnes, those three farmers should send the 3 tonnes to the main factory i.
Constraint 2: If the total crop (tonnes) from the selected farmers (j) exceeds the factory's demand, the selected farmers still send all of their crop to the factory. For example, if the factory needs 5 tonnes of crops but farmer 23 has 6 tonnes, that farmer should send all 6 tonnes to the main factory i.
I tried to use an if/then statement, but it is not allowed there. Maybe a loop would work? Any suggestion is welcome. This is my idea, but I do not know how to code it correctly:
con1(i,k).. sum(j, a(j,k)*b(j)*x(i,j,k))$(sum(j, a(j,k)*b(j)*x(i,j,k)) < d(i,k)) =e= d(i,k);
con2(i,k).. sum(j, a(j,k)*b(j)*x(i,j,k))$(sum(j, a(j,k)*b(j)*x(i,j,k)) > d(i,k)) =g= d(i,k);
a(j,k) is the crop yield planned for each farmer j in month k; these need to be added together, using the binary variable x(i,j,k), to meet the factory demand d(i,k).
c(j,k) is the number of areas for each farmer.
h(i,j) is the distance between each farmer and the factory.
m(i,j) is the transportation cost.
My objective is to minimize cost, and I am afraid that con1 will not work.
I look forward to hearing from you. Thanks.
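Even with the missing multiplication signs fixed, the $-conditions above cannot work: in GAMS, a $ condition inside an equation definition may not depend on a variable such as x(i,j,k). What the two constraints describe is simply "the farmers ship min(available, demand)", and the usual workaround is a big-M linearisation with an auxiliary shipped quantity and a binary switch. Below is a minimal single-factory, single-month sketch in Python/PuLP (all data values are made up; the same constraints translate directly to GAMS):

from pulp import LpProblem, LpMinimize, LpVariable, lpSum, LpBinary, value

# Made-up data for one factory and one month
avail = {1: 1.0, 2: 2.0, 3: 0.5}    # a(j)*b(j): tonnes farmer j can send
demand = 5.0                         # d(i,k)
M = sum(avail.values()) + demand     # a safe big-M

prob = LpProblem("ship_min_of_supply_and_demand", LpMinimize)
x = {j: LpVariable(f"x_{j}", cat=LpBinary) for j in avail}   # farmer j selected?
s = LpVariable("shipped", lowBound=0)                        # tonnes shipped
y = LpVariable("demand_binds", cat=LpBinary)                 # 1 if demand is the minimum

total = lpSum(avail[j] * x[j] for j in avail)
prob += s <= total                   # cannot ship more than the selected supply
prob += s <= demand                  # cannot ship more than the demand
prob += s >= total - M * y           # together with the two caps, these force
prob += s >= demand - M * (1 - y)    # s = min(total, demand) exactly
prob += -s                           # toy objective: maximise tonnes shipped

prob.solve()
print(value(s), {j: value(x[j]) for j in avail})

In the real model the objective would be the transportation cost built from m(i,j), and s and y would be indexed over (i,k).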

Is there an API to get a full citation (such as a BibTeX or JSON citation) from an arbitrary URL?

Say I have a URL like https://www.science.org/doi/10.1126/science.abb4363, how can I get the full citation as:
@article{doi:10.1126/science.abb4363,
author = {Sergio Almécija and Ashley S. Hammond and Nathan E. Thompson and Kelsey D. Pugh and Salvador Moyà-Solà and David M. Alba },
title = {Fossil apes and human evolution},
journal = {Science},
volume = {372},
number = {6542},
pages = {eabb4363},
year = {2021},
doi = {10.1126/science.abb4363},
URL = {https://www.science.org/doi/abs/10.1126/science.abb4363},
eprint = {https://www.science.org/doi/pdf/10.1126/science.abb4363},
abstract = {There has been much focus on the evolution of primates and especially where and how humans diverged in this process. It has often been suggested that the last common ancestor between humans and other apes, especially our closest relative, the chimpanzee, was ape- or chimp-like. Almécija et al. review this area and conclude that the morphology of fossil apes was varied and that it is likely that the last shared ape ancestor had its own set of traits, different from those of modern humans and modern apes, both of which have been undergoing separate suites of selection pressures. Science, this issue p. eabb4363 A Review describes the unique and varied morphologies in fossil and modern apes, including humans. Humans diverged from apes (chimpanzees, specifically) toward the end of the Miocene ~9.3 million to 6.5 million years ago. Understanding the origins of the human lineage (hominins) requires reconstructing the morphology, behavior, and environment of the chimpanzee-human last common ancestor. Modern hominoids (that is, humans and apes) share multiple features (for example, an orthograde body plan facilitating upright positional behaviors). However, the fossil record indicates that living hominoids constitute narrow representatives of an ancient radiation of more widely distributed, diverse species, none of which exhibit the entire suite of locomotor adaptations present in the extant relatives. Hence, some modern ape similarities might have evolved in parallel in response to similar selection pressures. Current evidence suggests that hominins originated in Africa from Miocene ape ancestors unlike any living species.}}
I was able to download the citation by visiting the link manually, but are there any programmatic APIs to convert a URL (even a Wikipedia URL) into a formal citation? If not, I am not sure what the recommended approach is for getting these efficiently.
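One programmatic route that does exist for DOI-bearing URLs is content negotiation against the https://doi.org/ resolver, which Crossref and DataCite both support: request the DOI with an Accept: application/x-bibtex header and you get BibTeX back. A minimal sketch (you still have to extract the DOI from the page URL yourself; for non-DOI pages such as Wikipedia there is no equally universal service):

import requests

def bibtex_from_doi(doi: str) -> str:
    # DOI content negotiation: https://doi.org/ returns the citation
    # in the format named in the Accept header.
    r = requests.get(f"https://doi.org/{doi}",
                     headers={"Accept": "application/x-bibtex"},
                     timeout=30)
    r.raise_for_status()
    return r.text

print(bibtex_from_doi("10.1126/science.abb4363"))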

Setting initial values for non-linear parameters via tabuSearch

I'm trying to fit the LPPL model to the KLSE index to predict the most probable crash time. Many papers suggest tabuSearch for identifying initial values for the non-linear parameters, but none of them publishes code. I have tried to fit the index using NLS and the log-periodic power law (LPPL) in R, but the resulting errors and p-values are not significant. I believe the initial values are not accurate. Can anyone help me find proper initial values?
library(tseries)
library(zoo)
ts<-get.hist.quote(instrument="^KLSE",start="2003-04-18",end="2008-01-30",quote="Close",provider="yahoo",origin="1970-01-01",compression="d",retclass="zoo")
df<-data.frame(ts)
df<-data.frame(Date=as.Date(rownames(df)),Y=df$Close)
df<-df[!is.na(df$Y),]
library(minpack.lm)
library(ggplot2)
df$days<-as.numeric(df$Date-df[1,]$Date)
f<-function(pars,xx){pars$a + (pars$tc - xx)^pars$m *(pars$b+ pars$c * cos(pars$omega*log(pars$tc - xx) + pars$phi))}
resids <- function(p, observed, xx) observed - f(p, xx)
nls.out <- nls.lm(par=list(a=600,b=-266,tc=3000, m=.5,omega=7.8,phi=-4,c=-14),fn = resids, observed = df$Y, xx = df$days, control= nls.lm.control (maxiter =1024, ftol=1e-6, maxfev=1e6))
par<-nls.out$par
nls.final<-nls(Y~(a+(tc-days)^m*(b+c*cos(omega*log(tc-days)+phi))),data=df,start=par,algorithm="plinear",control=nls.control(maxiter=10024,minFactor=1e-8))
summary(nls.final)
I would look at some of the newer research on this topic; there is a good trigonometric reformulation that practically guarantees a well-behaved optimization. Additionally, you can use R's built-in linear least-squares solver to recover the linearizable parameters, so you only need to optimize over 3 dimensions. The link below should get you started. Based on recent literature and personal experience, I would strongly advise against a tabu search.
https://www.ethz.ch/content/dam/ethz/special-interest/mtec/chair-of-entrepreneurial-risks-dam/documents/dissertation/master%20thesis/MAS_final_Tuncay.pdf
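To make the suggested reformulation concrete, here is a sketch in Python (the question's code is in R, but the idea carries over directly, and the starting values below are just the ones used above). Writing c*cos(omega*log(tc-t) + phi) as c1*cos(omega*log(tc-t)) + c2*sin(omega*log(tc-t)) makes the model linear in (a, b, c1, c2) for fixed (tc, m, omega), so those four can be recovered by ordinary least squares and the non-linear search runs over three dimensions only:

import numpy as np
from scipy.optimize import minimize

def lppl_sse(theta, t, y):
    # Sum of squared residuals with the linear parameters profiled out.
    tc, m, om = theta
    if tc <= t.max():            # the critical time must lie beyond the sample
        return np.inf
    dt = tc - t
    X = np.column_stack([np.ones_like(dt),
                         dt**m,
                         dt**m * np.cos(om * np.log(dt)),
                         dt**m * np.sin(om * np.log(dt))])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)   # a, b, c1, c2 by OLS
    resid = y - X @ coef
    return resid @ resid

# t, y = df['days'].values, df['Y'].values   # the data prepared above
# res = minimize(lppl_sse, x0=[t.max() + 100.0, 0.5, 7.8],
#                args=(t, y), method='Nelder-Mead')
# tc, m, omega = res.x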

How to model measures that depend on the underlying substance

I'm using the Aconcagua measurement library in Pharo. I've had a lot of success using it to model things like days and kilometers, but have encountered an interesting problem where converting between units requires information on the underlying substance being measured. The formula for expressing the amount of a substance in air in parts per million, given the amount in milligrams per cubic meter is:
ppm = (mg/m3 × 24.45) / mw; where mw is the molecular weight of the material and 24.45 is the molar volume, in litres, of an ideal gas at 25 °C and 1 atm.
I'm envisioning usage like:
tlvCO := carbonMonoxide tlv. "returns the Threshold Limit Value as 29 mg/m3"
...
tlvCO convertTo: PPM "where PPM is an Aconcagua unit"
The problem is that, while the examples of measurements I've seen in Aconcagua contain in themselves all the information you need for conversion, in this case you have to know the molecular weight of the underlying substance being measured. Thus mg/m3 -> ppm is not inherently meaningful; a properly formed question would be mg/m3 of ammonia -> ppm.
My instinct is to either:
create a new class like MaterialQuantity which has a material and a measure, or
create a special unit subclass that has a material
But I'm not 100% sold and would like some input...
I don't think that the molecular weight is part of the unit, but part of a calculation, like the 24.45 (the molar volume, in litres, of an ideal gas at 25 °C and 1 atm, i.e. the conditions the formula assumes).
I am not sure that ppm is a unit you can convert to a density unit, because they belong to different domains.
As far as I understand, you need to reify tlv as a compound unit or formula that you can ask for its element. Then you could simply do something like [:tlv | tlv * (24.45 / tlv element)]
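For what it's worth, here is a minimal sketch of the first option (a quantity that knows its material), written in Python rather than Pharo purely for illustration; the class and attribute names are invented, and 24.45 is the molar volume in litres of an ideal gas at 25 °C and 1 atm:

from dataclasses import dataclass

MOLAR_VOLUME_25C = 24.45   # litres/mol for an ideal gas at 25 degC, 1 atm

@dataclass(frozen=True)
class Material:
    name: str
    molecular_weight: float   # g/mol

@dataclass(frozen=True)
class MaterialQuantity:
    material: Material
    value: float
    unit: str                 # only 'mg/m3' and 'ppm' in this sketch

    def to_ppm(self) -> "MaterialQuantity":
        if self.unit == "ppm":
            return self
        # ppm = (mg/m3 * 24.45) / mw -- only meaningful because the
        # quantity carries its material, hence its molecular weight
        ppm = self.value * MOLAR_VOLUME_25C / self.material.molecular_weight
        return MaterialQuantity(self.material, ppm, "ppm")

co = Material("carbon monoxide", 28.01)
tlv = MaterialQuantity(co, 29.0, "mg/m3")
print(tlv.to_ppm())   # about 25.3 ppm, close to the published TLV for CO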

Optimizing Portfolio With Bounds on Weights and Costs

I wish to create efficient frontiers for portfolios with bounds on both weights and costs. The following code provides the frontiers for portfolios in which the underlying assets are bounded with minimum and maximum weights. How do I add to this a secondary constraint in which the combined annual charges of the underlying assets do not exceed a maximum? Assume each asset has an annual cost which is applied as a percentage. As such the combined weights*charges should not exceed x%.
lb=Bounds(:,1);
ub=Bounds(:,2);
P = Portfolio('AssetList', AssetList,'LowerBound', lb, 'UpperBound', ub, 'Budget', 1);
P = P.estimateAssetMoments(AssetReturns);
[Passetmean, Passetcovar] = P.getAssetMoments;
Correlations=corrcoef(AssetReturns);
% Estimate Frontier
pwgt = P.estimateFrontier(20);
[prsk, pret] = P.estimatePortMoments(pwgt);
Mary,
having entered another set of constraints into the model, kindly notice that the modified efficient-frontier problem is no longer guaranteed to be a convex-optimisation problem.
Thus one may forget about the comfort of all the popular fmincg(), L-BFGS et al. solvers.
There will not be a one-line call that gets the answer(s) out of the box.
Non-linear problems (the wilder, the more so) will require you to assemble another optimisation function, be it either
a brute-force based scanner,
scanning a fully orthogonal mesh, with a "utility function" defined so that, as the given requirement states, it also incorporates the add-on cost of holding a Portfolio item,
or
a genetic-algorithm based approach,
in the belief that the brute-force one might become so time-extensive as to cease to be feasible, while a GA evolution may yield acceptable sub-optimal (local-optima) outputs.
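A side note on the convexity point above: if each asset's annual charge is a fixed percentage rate, the cap sum(w_i * c_i) <= x is linear in the weights, so the problem can in fact remain a convex quadratic programme. A hedged sketch in Python/SciPy with made-up data (the question uses MATLAB's Portfolio object, which is not reproduced here):

import numpy as np
from scipy.optimize import minimize

# Made-up inputs: expected returns, covariance, per-asset charge rates
rng = np.random.default_rng(0)
n = 4
mu = np.array([0.06, 0.08, 0.10, 0.12])
A = rng.normal(size=(n, n))
Sigma = A @ A.T / 10                        # positive-definite covariance
charges = np.array([0.001, 0.004, 0.007, 0.012])
max_cost = 0.005                            # cap on sum(w * charges)
lb, ub = np.zeros(n), np.ones(n)            # weight bounds

def frontier_point(target_return):
    # Minimise variance subject to budget, target return, and cost cap.
    cons = [
        {"type": "eq",   "fun": lambda w: w.sum() - 1.0},
        {"type": "eq",   "fun": lambda w: w @ mu - target_return},
        {"type": "ineq", "fun": lambda w: max_cost - w @ charges},
    ]
    return minimize(lambda w: w @ Sigma @ w, x0=np.full(n, 1.0 / n),
                    bounds=list(zip(lb, ub)), constraints=cons)

for r in np.linspace(mu.min(), mu.max(), 5):
    res = frontier_point(r)
    if res.success:                          # high targets may be infeasible
        print(f"target {r:.3f}: risk {np.sqrt(res.fun):.4f}")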