I am trying to write the following code but it gives me "syntax error, unexpected forall".
How do I fix this?
maximize sum(i in cargos, j in comps) profit[i]*x[i][j];
subject to {
  cons01
  forall(i in cargos)
    available_wight:
      sum(j in comps) x[i][j] <= weight[i];
  cons02:
  forall(j in comps)
    weight_capacity:
      sum(i in cargos) x[i][j] <= weight_cap[j];
  cons03;
  forall(j in comps)
    space_capacity;
      sum(i in cargos) valume[i]*x[i][j] <= space_cap[j];
}
Remove cons01: available_wight: is already the label of the constraint, and the extra label in front of the forall is what triggers the "unexpected forall" error. The same applies to cons02 and cons03 (also note that cons03; and space_capacity; use a semicolon where a label needs a colon).
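With the extra labels removed and the stray semicolons changed to colons, the block would look something like this (identifiers such as available_wight and valume are kept as spelled in the original model):

```opl
maximize sum(i in cargos, j in comps) profit[i]*x[i][j];
subject to {
  forall(i in cargos)
    available_wight:
      sum(j in comps) x[i][j] <= weight[i];
  forall(j in comps)
    weight_capacity:
      sum(i in cargos) x[i][j] <= weight_cap[j];
  forall(j in comps)
    space_capacity:
      sum(i in cargos) valume[i]*x[i][j] <= space_cap[j];
}
```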
regards
The first assertion says that all the values in the missing array are 0. The second says that at least one value in missing is 0. If the first assertion is true, the second must also be true.
If anyone wants the full code, it is given below. It is essentially a program that finds the missing number in the given array.
method FindMissing(n: int, a: array<int>) returns (m: int)
  requires 2 <= n <= 2 * 100000
  requires a.Length == n - 1
  requires forall j, k :: 0 <= j < k < a.Length ==> a[j] != a[k]
  requires forall k :: 0 <= k < a.Length ==> 1 <= a[k] <= n
  ensures 1 <= m <= n
  ensures forall k :: 0 <= k < a.Length ==> a[k] != m
{
  var missing := new int[n];
  var i := 0;
  while i < missing.Length
    invariant 0 <= i <= missing.Length
    invariant forall k :: 0 <= k < i ==> missing[k] == 0
  {
    missing[i] := 0;
    i := i + 1;
  }
  assert forall k :: 0 <= k < missing.Length ==> missing[k] == 0;
  assert exists k :: 0 <= k < missing.Length && missing[k] == 0;
  i := 0;
  while i < a.Length
    invariant 0 <= i <= a.Length
    invariant forall k :: 0 <= k < i ==> missing[a[k] - 1] == 1
    invariant forall k :: i <= k < a.Length ==> missing[a[k] - 1] == 0
    invariant exists k :: 0 <= k < missing.Length && missing[k] == 0
  {
    missing[a[i] - 1] := 1;
    i := i + 1;
  }
  assert exists k :: 0 <= k < missing.Length && missing[k] == 0;
  i := 0;
  while i < missing.Length {
    if missing[i] == 0 {
      m := i + 1;
      break;
    }
    i := i + 1;
  }
}
To prove an existential, you often have to supply a witness. To do that for the assertion, add another assertion with the witness in front of it. Like this:
assert missing[0] == 0;
assert exists k :: 0 <= k < missing.Length && missing[k] == 0;
For the existential quantifier in the loop invariant, I suggest you introduce another variable to hold the witness. If that variable is only used in the proof, you can make it a ghost. Your code will look something like this:
ghost var indexOfMissing := 0;
i := 0;
while i < a.Length
  ...
  invariant 0 <= indexOfMissing < missing.Length && missing[indexOfMissing] == 0
You'll have to manually update indexOfMissing inside the loop to maintain the invariant.
Here is one other point about logic and two points about Dafny:
You started by saying that the forall implies the exists. This is not true if the range of the quantifiers is empty. Luckily, in your case, you do have missing.Length != 0.
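To illustrate the empty-range point, here is a small sketch (a hypothetical lemma, not part of the original code): over an empty range the forall holds vacuously, while the exists has no witness and is false.

```dafny
lemma EmptyRangeExample(missing: array<int>)
  requires missing.Length == 0
{
  // Vacuously true: there is no k in the range to check.
  assert forall k :: 0 <= k < missing.Length ==> missing[k] == 0;
  // False: an exists needs at least one witness, and the range is empty.
  assert !(exists k :: 0 <= k < missing.Length && missing[k] == 0);
}
```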
Your first loop initializes the elements of missing to 0. There are two simpler ways to do that in Dafny. One is to use an aggregate statement that performs a bunch of simultaneous assignments:
forall i | 0 <= i < missing.Length {
  missing[i] := 0;
}
The other is to give new a function that says how to initialize the elements.
var missing := new int[n](i => 0);
The beauty with both of these is that they are not loops, and that means you don't need to maintain a loop index and write various loop invariants.
You can also eliminate the final loop if you use an assign-such-that statement:
m :| 0 <= m < missing.Length && missing[m] == 0;
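Putting these suggestions together, the method might be sketched as follows. This is an unverified sketch: the element initializer replaces the first loop and the assign-such-that replaces the final search loop, but the exists invariant on the marking loop may still need the ghost-witness bookkeeping described earlier.

```dafny
method FindMissing(n: int, a: array<int>) returns (m: int)
  requires 2 <= n <= 2 * 100000
  requires a.Length == n - 1
  requires forall j, k :: 0 <= j < k < a.Length ==> a[j] != a[k]
  requires forall k :: 0 <= k < a.Length ==> 1 <= a[k] <= n
  ensures 1 <= m <= n
  ensures forall k :: 0 <= k < a.Length ==> a[k] != m
{
  // Element initializer: no loop, no loop index, no loop invariants.
  var missing := new int[n](i => 0);
  var i := 0;
  while i < a.Length
    invariant 0 <= i <= a.Length
    invariant forall k :: 0 <= k < i ==> missing[a[k] - 1] == 1
    invariant forall k :: i <= k < a.Length ==> missing[a[k] - 1] == 0
    invariant exists k :: 0 <= k < missing.Length && missing[k] == 0
  {
    missing[a[i] - 1] := 1;
    i := i + 1;
  }
  // Assign-such-that: pick some index whose entry is still 0.
  var idx :| 0 <= idx < missing.Length && missing[idx] == 0;
  m := idx + 1;
}
```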
Trying to prove a simple algorithm in Dafny, but I just get an "assertion violation" on the last assertion with no extra details. Can anybody spot what is wrong and how to fix it? Formal methods are not my specialty.
method BubbleSort(a: array?<int>)
  modifies a
  requires a != null
  ensures sorted(a, 0, a.Length - 1)
{
  var i := a.Length - 1;
  while i > 0
    decreases i
    invariant i < 0 ==> a.Length == 0
    invariant -1 <= i < a.Length
    invariant sorted(a, i, a.Length - 1)
    invariant partitioned(a, i)
  {
    var j := 0;
    while j < i
      decreases i - j
      invariant 0 < i < a.Length && 0 <= j <= i
      invariant sorted(a, i, a.Length - 1)
      invariant partitioned(a, i)
      invariant forall k :: 0 <= k <= j ==> a[k] <= a[j]
    {
      if a[j] > a[j+1] {
        a[j], a[j+1] := a[j+1], a[j];
      }
      j := j + 1;
    }
    i := i - 1;
  }
}
predicate sorted(a: array?<int>, l: int, u: int)
  reads a // Dafny syntax: this is NEEDED to say that the predicate reads the array
  requires a != null
{
  forall i, j :: 0 <= l <= i <= j <= u < a.Length ==> a[i] <= a[j]
}
predicate partitioned(a: array?<int>, i: int)
  reads a
  requires a != null
{
  forall k, k' :: 0 <= k <= i < k' < a.Length ==> a[k] <= a[k']
}
method testSort()
{
  var b := new int[2];
  b[0], b[1] := 2, 1;
  assert b[0] == 2 && b[1] == 1;
  BubbleSort(b);
  assert b[0] == 1 && b[1] == 2;
}
The problem is that the postcondition (ensures clause) of BubbleSort gives no information about which values end up in a. When Dafny does verification, it verifies each method independently, using only the specifications (not the bodies) of other methods. So when Dafny verifies testSort, it doesn't look at the body of BubbleSort, only at its postcondition, and sorted(a, 0, a.Length - 1) says only that the array is sorted, not that it holds a permutation of the original elements. That isn't enough to prove b[0] == 1 && b[1] == 2.
For more information, see the FAQ and the section on assertions in the tutorial.
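One common way to fix this (a sketch; the matching loop invariants still have to be added inside BubbleSort's body) is to strengthen the postcondition so that callers also learn the elements are preserved:

```dafny
method BubbleSort(a: array?<int>)
  modifies a
  requires a != null
  ensures sorted(a, 0, a.Length - 1)
  // The array after sorting holds exactly the same elements as before.
  ensures multiset(a[..]) == old(multiset(a[..]))
```

With that postcondition, testSort knows the result holds the same two values as [2, 1], which combined with sortedness pins it down to [1, 2]. To verify the body, an invariant like multiset(a[..]) == old(multiset(a[..])) would be added to both loops; the swap statement preserves it.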
I'm trying to write this sum in OPL:
I did this, but it is not exactly what I need.
forall (n in cont, t in tempo, o in portos)
sum(i in colunap, j in linhap)b[i][j][n][t] + v[n][t] == 1;
It should be something like this, but OPL does not accept it:
forall (n in cont[o], t in tempo[o], o in portos)
sum(i in colunap[o], j in linhap[o])b[i][j][n][t] + v[n][t] == 1;
This should work. In your forall, o is used (in cont[o] and tempo[o]) before it is declared by o in portos, which OPL cannot handle; one workaround is to declare the range over the maximum size across all ports:
int P = 3;
int H[1..P-1] = [1, 2];
range linhap = 1..max(o in 1..P-1) H[o];
I have the following model, in which the objective function indexes the vector p with a variable (u[i] appears in the subscript of p). But AMPL displays an error: "subscript variables are not yet allowed."
How can I implement this kind of indexing in the objective function?
Thanks in advance and best regards.
Gabriel
param dimension;
set T:={1..dimension};
set O:={0};
set V:= O union T;
param c{i in V, j in V};
param p{i in V};
set ady{i in V} within V := {j in V : i<>j and c[i,j] <> -1} ;
# Variables
var x{i in V, j in V} binary;
var u{i in V} integer;
# Objective
minimize costo: sum{i in V, j in V} p[u[i]-1] * x[i,j] * c[i,j];
# Constraints
s.t. grado_a {j in V} : sum{i in ady[j] : j <> i} x[i,j] = 1;
s.t. grado_b {i in V} : sum{j in ady[i] : i <> j} x[i,j] = 1;
s.t. origen {i in O} : u[i] = 0;
s.t. sigo_1 {i in T} : u[i] >=1;
s.t. sigo_2 {i in T} : u[i] <= card(V) -1;
s.t. precedencia {i in T, j in T : i <> j} : u[i] - u[j] + 1 <= (card(V) - 1)*(1 - x[i,j]) ;
AMPL doesn't allow variables in subscripts yet. However, the ilogcp driver for AMPL supports the element constraint, for example:
include cp.ampl;
minimize costo:
sum{i in V, j in V} element({v in V} p[v], u[i] - 1) * x[i,j] * c[i,j];
where element({v in V} p[v], u[i] - 1) is equivalent to p[u[i] - 1] and is translated into an IloElement constraint.
I have been trying to solve a seriation problem using GNU MathProg (GLPK), but I couldn't write a summation like the following.
param n, integer, >= 3;
set O := 1..n;
param d{i in O,j in O};
var x{i in O,j in O}, binary, i < j;
var v{i in O,j in O,k in O}, binary, i < j < k;
maximize total: sum{i in O,j in O, i<j}(d[i,j] - d[j,i])* x[i,j] + sum{i in O,j in O, i<j}d[j,i];
s.t. tran{i in O,j in O,k in O, i<j<k}: x[i,j] + x[j,i] - x[i,k] + v[i,j,k] = 1;
Thanks
You should use : instead of , before the "such that" condition i < j:
sum{i in O, j in O: i < j} ...
#                 ^ note ':' here
The same applies to the conditions in the var declarations, the total objective, and the tran constraint.
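Applied throughout the model, the affected lines might look like this. This is a sketch based only on the colon rule above; note that x[j,i] and x[i,k] in tran would then fall outside x's declared domain (which only has i < j), so the model may need further changes beyond the syntax fix.

```ampl
var x{i in O, j in O: i < j}, binary;
var v{i in O, j in O, k in O: i < j and j < k}, binary;

maximize total:
    sum{i in O, j in O: i < j} (d[i,j] - d[j,i]) * x[i,j]
  + sum{i in O, j in O: i < j} d[j,i];

s.t. tran{i in O, j in O, k in O: i < j and j < k}:
    x[i,j] + x[j,i] - x[i,k] + v[i,j,k] = 1;
```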