Appending a pointer to a Slice in Golang

I want to append a pointer to a slice. Is it possible? Parentnode.children is a slice, and I want to append X to it as a pointer.
https://play.golang.org/p/ghWtxWGOAU
func Tree(Parentnode *Node) {
    if IsvisitedNode(Parentnode.currentvalue - 1) {
        m := MovesArray[Parentnode.currentvalue-1]
        for j := 0; j < 8; j++ {
            if m[j] != 0 {
                var X *Node
                X.parentnode = Parentnode
                X.currentvalue = m[j]
                if IsvisitedNode(m[j]) {
                    Parentnode.children = append(Parentnode.children, *X)
                    Tree(X)
                }
            }
        }
    }
}

You have an off-by-one error.
In main you set Y.currentvalue = 1.
Then in Tree, currentvalue walks up to 64.
X.currentvalue = m[j]
fmt.Printf("cv: %v\n", X.currentvalue) // walks up to 64
if IsvisitedNode(m[j]) {
And in IsvisitedNode you use that value as an index into visithistory, which has 64 elements, so the highest valid index is 63 -> index out of range error.
var visithistory [64]bool

func IsvisitedNode(position int) bool {
    if visithistory[position] == true {
Things work if you set var visithistory [65]bool, but I think you need to rethink your logic here somewhat.
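As for the title question: appending a pointer works as long as the slice's element type is a pointer type, and the pointer must point at an allocated Node before you write through it. A minimal sketch, assuming a children []*Node field (the Node definition is not shown in the question, and addChild is a hypothetical helper):
type Node struct {
    currentvalue int
    parentnode   *Node
    children     []*Node // element type is *Node, so pointers can be appended directly
}

func addChild(parent *Node, value int) {
    X := &Node{parentnode: parent, currentvalue: value} // allocate first; writing through a nil *Node panics
    parent.children = append(parent.children, X)        // append the pointer itself, not *X
}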


Index out of bounds while checking if a string is rotated

// This function checks if two strings are rotations of each other
//
// # Arguments
//
// `str1` - a &str reference to one string to check
// `str2` - a &str reference to the other string to check
//
// # Return
//
// Returns a bool denoting whether the strings are rotations of each other
pub fn is_rotation(str1: &str, str2: &str) -> bool {
    let len1 = str1.len();
    let len2 = str2.len();
    let string1: Vec<char> = str1.chars().collect();
    let string2: Vec<char> = str2.chars().collect();
    if len1 != len2 {
        return false;
    }
    let mut longest_prefix_suffix = vec![0, len1];
    let mut prev_len = 0;
    let mut i = 1;
    while i < len1 {
        if string1[i] == string2[prev_len] {
            prev_len += 1;
            longest_prefix_suffix[i] = prev_len;
            i += 1;
        } else if prev_len == 0 {
            longest_prefix_suffix[i] = 0;
            i += 1;
        } else {
            prev_len = longest_prefix_suffix[prev_len - 1];
        }
    }
    i = 0;
    let mut k = longest_prefix_suffix[len1 - 1];
    while k < len2 {
        if string2[k] != string1[i] {
            return false;
        }
        i += 1;
        k += 1;
    }
    true
}
When I run the code, I receive the following error:
thread 'main' panicked at 'index out of bounds: the len is 2 but the index is 2', src/rotation.rs:29:13
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
How would I solve this?
It looks like there is a typo for longest_prefix_suffix. I assume you intended to write the following:
let mut longest_prefix_suffix = vec![0; len1];
Note the ; between 0 and len1.
The use of a , instead creates a Vec with just two elements: [0, len1].
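A quick standalone illustration of the two forms:
fn main() {
    let len1 = 5;
    let with_comma = vec![0, len1];     // two elements: [0, 5]
    let with_semicolon = vec![0; len1]; // len1 zeros: [0, 0, 0, 0, 0]
    assert_eq!(with_comma.len(), 2);
    assert_eq!(with_semicolon.len(), 5);
}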
Alternatively, an easier way might be the following:
fn is_rotation(s1: &str, s2: &str) -> bool {
    if s1.len() != s2.len() {
        return false;
    }
    s1.repeat(2).contains(&s2)
}
assert!(is_rotation("hello", "ohell"));  // true
assert!(!is_rotation("hello", "olleh")); // is_rotation returns false

Return Option inside Loop

The program aims to use a loop to check whether the index of an iterator variable meets certain criteria (e.g., index == 3). If it finds the desired index it should return Some(123), else return None.
fn main() {
    fn foo() -> Option<i32> {
        let mut x = 5;
        let mut done = false;
        while !done {
            x += x - 3;
            if x % 5 == 0 {
                done = true;
            }
            for (index, value) in (5..10).enumerate() {
                println!("index = {} and value = {}", index, value);
                if index == 3 {
                    return Some(123);
                }
            }
            return None; // capture every other possibility, so the while loop would surely return either a Some or a None
        }
    }
}
The compiler gives this error:
error[E0308]: mismatched types
--> <anon>:7:9
|
7 | while !done {
| ^ expected enum `std::option::Option`, found ()
|
= note: expected type `std::option::Option<i32>`
= note: found type `()`
I think the error source might be that a while loop evaluates to a (), thus it would return a () instead of Some(123). I don't know how to return a valid Some type inside a loop.
The value of any while true { ... } expression is always (). So the compiler expects your foo to return an Option<i32>, but finds that the last value in your foo body is ().
To fix this, you can add a return None outside the original while loop. You can also use the loop construct like this:
fn main() {
    // run the code
    foo();

    fn foo() -> Option<i32> {
        let mut x = 5;
        loop {
            x += x - 3;
            for (index, value) in (5..10).enumerate() {
                println!("index = {} and value = {}", index, value);
                if index == 3 {
                    return Some(123);
                }
            }
            if x % 5 == 0 {
                return None;
            }
        }
    }
}
The behaviour of while true { ... } statements is maybe a bit quirky and there have been a few requests to change it.
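As a side note, loop also composes with break in a way while cannot: in later Rust versions (1.19 and up), break can carry a value, which makes the whole loop an expression of that type. A small sketch, not from the original answer:
fn first_multiple_of_seven(start: i32) -> i32 {
    let mut x = start;
    loop {
        if x % 7 == 0 {
            break x; // the loop expression evaluates to x
        }
        x += 1;
    }
}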

"Binary/Unary operator '++/<' cannot be applied to an operand of type" AND "Use of unresolved identifier '=-'"

I am translating an Obj-C app to Swift and having trouble with some syntax. I believe I have declared the variable types correctly, so I don't know why I'm getting these errors. Maybe some blocks are located incorrectly inside classes/functions when they should be outside, or something. I would love it if you could review my code. I'm new to programming, so what may be a clear and explicit explanation to you will probably still be vague to me, so please show examples using the existing names.
Thanks
"Unary operator '++' cannot be applied to an operand of type 'Int?'"
and
"Binary operator '<' cannot be applied to an operand of type 'Int? and Float'"
and
"Use of unresolved identifier '=-'"
import UIKit
import Foundation
import AVFoundation

let minFramesForFilterToSettle = 10

enum CurrentState {
    case statePaused
    case stateSampling
}

class ViewController: UIViewController, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()
    var camera: AVCaptureDevice?
    var validFrameCounter: Int = 0
    var pulseDetector: PulseDetector!
    var filter: Filter!
    var currentState = CurrentState.stateSampling // Is this initialized correctly?

    override func viewDidLoad() {
        super.viewDidLoad()
        self.pulseDetector = PulseDetector()
        self.filter = Filter()
        // TO DO startCameraCapture() // call to un-used function.
    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
        // Dispose of any resources that can be recreated.
    }
}

let NZEROS = 10
let NPOLES = 10

class Filter {
    var xv = [Float](count: NZEROS + 1, repeatedValue: 0)
    var yv = [Float](count: NPOLES + 1, repeatedValue: 0)

    func processValue(value: Float) -> Float {
        let gain: Float = 1.894427025e+01
        xv[0] = xv[1]; xv[1] = xv[2]; xv[2] = xv[3]; xv[3] = xv[4]; xv[4] = xv[5]; xv[5] = xv[6]; xv[6] = xv[7]; xv[7] = xv[8]; xv[8] = xv[9]; xv[9] = xv[10]; xv[10] = value / gain
        yv[0] = yv[1]; yv[1] = yv[2]; yv[2] = yv[3]; yv[3] = yv[4]; yv[4] = yv[5]; yv[5] = yv[6]; yv[6] = yv[7]; yv[7] = yv[8]; yv[8] = yv[9]; yv[9] = yv[10]
        yv[10] = (xv[10] - xv[0]) + 5 * (xv[2] - xv[8]) + 10 * (xv[6] - xv[4]) +
            (-0.0000000000 * yv[0]) + (0.0357796363 * yv[1]) +
            (-0.1476158522 * yv[2]) + (0.3992561394 * yv[3]) +
            (-1.1743136181 * yv[4]) + (2.4692165842 * yv[5]) +
            (-3.3820859632 * yv[6]) + (3.9628972812 * yv[7]) +
            (-4.3832594900 * yv[8]) + (3.2101976096 * yv[9])
        return yv[10]
    }
}
let maxPeriod = 1.5 // float?
let minPeriod = 0.1 // float?
let invalidEntry: Double = -11
let maxPeriodsToStore: Int = 20
let averageSize: Float = 20

class PulseDetector {
    var upVals: [Float] = [averageSize]
    var downVals: [Float] = [averageSize]
    var upValIndex: Int?
    var downValIndex: Int?
    var lastVal: Float?
    var periodStart: Float?
    var periods: [Double] = []
    var periodTimes: [Double] = []
    var periodIndex: Int?
    var started: Bool?
    var freq: Float?
    var average: Float?
    var wasDown: Bool?

    func reset() {
        for var i = 0; i < maxPeriodsToStore; i++ {
            periods[i] = invalidEntry
        }
        for var i = 0; i < averageSize; i++ { // why error when PulseDetector.h said averageSize was an Int?
            upVals[i] = invalidEntry
            downVals[i] = invalidEntry
        }
        freq = 0.5
        periodIndex = 0
        downValIndex = 0
        upValIndex = 0
    }

    func addNewValue(newVal: Float, atTime: Double) -> Float {
        // we keep track of the number of values above and below zero
        if newVal > 0 {
            upVals[upValIndex!] = newVal
            upValIndex++
            if upValIndex >= averageSize {
                upValIndex = 0
            }
        }
        if newVal < 0 {
            downVals[downValIndex] =- newVal
            downValIndex++
            if downValIndex >= averageSize {
                downValIndex = 0
            }
        }
        // work out the average value above zero
        var count: Float
        var total: Float
        for var i = 0; i < averageSize; i++ {
            if upVals[i] != invalidEntry {
                count++
                total += upVals[i]
            }
        }
        var averageUp = total / count
        // and the average value below zero
        count = 0
        total = 0
        for var i = 0; i < averageSize; i++ {
            if downVals[i] != invalidEntry {
                count++
                total += downVals[i]
            }
        }
        var averageDown = total / count
        // is the new value a down value?
        if newVal < (-0.5 * averageDown) {
            wasDown = true
        }
The original Objective-C code:
PulseDetector.h
#import <Foundation/Foundation.h>

#define MAX_PERIODS_TO_STORE 20 // is this an Int?
#define AVERAGE_SIZE 20         // is this a Float?
#define INVALID_PULSE_PERIOD -1 // done

@interface PulseDetector : NSObject {
    float upVals[AVERAGE_SIZE];
    float downVals[AVERAGE_SIZE];
    int upValIndex;
    int downValIndex;
    float lastVal;
    float periodStart;
    double periods[MAX_PERIODS_TO_STORE];     // this is an array!
    double periodTimes[MAX_PERIODS_TO_STORE]; // this is an array!!
    int periodIndex;
    bool started;
    float freq;
    float average;
    bool wasDown;
}

@property (nonatomic, assign) float periodStart; // var periodStart = float?

-(float) addNewValue:(float) newVal atTime:(double) time; // declaring a method called addNewValue:atTime: with 2 arguments called newVal and time that returns a float
-(float) getAverage; // declaring a method called getAverage that returns a float
-(void) reset;       // declaring a method that returns nothing

@end
PulseDetector.m
#import <QuartzCore/QuartzCore.h>
#import "PulseDetector.h"
#import <vector>
#import <algorithm>

#define MAX_PERIOD 1.5
#define MIN_PERIOD 0.1
#define INVALID_ENTRY -100 // is this a double?

@implementation PulseDetector

@synthesize periodStart;

- (id) init
{
    self = [super init];
    if (self != nil) {
        // set everything to invalid
        [self reset];
    }
    return self;
}

-(void) reset {
    for(int i=0; i<MAX_PERIODS_TO_STORE; i++) {
        periods[i]=INVALID_ENTRY;
    }
    for(int i=0; i<AVERAGE_SIZE; i++) {
        upVals[i]=INVALID_ENTRY;
        downVals[i]=INVALID_ENTRY;
    }
    freq=0.5;
    periodIndex=0;
    downValIndex=0;
    upValIndex=0;
}

-(float) addNewValue:(float) newVal atTime:(double) time {
    // we keep track of the number of values above and below zero
    if(newVal>0) {
        upVals[upValIndex]=newVal;
        upValIndex++;
        if(upValIndex>=AVERAGE_SIZE) {
            upValIndex=0;
        }
    }
    if(newVal<0) {
        downVals[downValIndex]=-newVal;
        downValIndex++;
        if(downValIndex>=AVERAGE_SIZE) {
            downValIndex=0;
        }
    }
    // work out the average value above zero
    float count=0;
    float total=0;
    for(int i=0; i<AVERAGE_SIZE; i++) {
        if(upVals[i]!=INVALID_ENTRY) {
            count++;
            total+=upVals[i];
        }
    }
    float averageUp=total/count;
    // and the average value below zero
    count=0;
    total=0;
    for(int i=0; i<AVERAGE_SIZE; i++) {
        if(downVals[i]!=INVALID_ENTRY) {
            count++;
            total+=downVals[i];
        }
    }
    float averageDown=total/count;
    // is the new value a down value?
    if(newVal<-0.5*averageDown) {
        wasDown=true;
    }
    // is the new value an up value and were we previously in the down state?
    if(newVal>=0.5*averageUp && wasDown) {
        wasDown=false;
        // work out the difference between now and the last time this happened
        if(time-periodStart<MAX_PERIOD && time-periodStart>MIN_PERIOD) {
            periods[periodIndex]=time-periodStart;
            periodTimes[periodIndex]=time;
            periodIndex++;
            if(periodIndex>=MAX_PERIODS_TO_STORE) {
                periodIndex=0;
            }
        }
        // track when the transition happened
        periodStart=time;
    }
    // return up or down
    if(newVal<-0.5*averageDown) {
        return -1;
    } else if(newVal>0.5*averageUp) {
        return 1;
    }
    return 0;
}

-(float) getAverage {
    double time=CACurrentMediaTime();
    double total=0;
    double count=0;
    for(int i=0; i<MAX_PERIODS_TO_STORE; i++) {
        // only use up to 10 seconds worth of data
        if(periods[i]!=INVALID_ENTRY && time-periodTimes[i]<10) {
            count++;
            total+=periods[i];
        }
    }
    // do we have enough values?
    if(count>2) {
        return total/count;
    }
    return INVALID_PULSE_PERIOD;
}

@end
Your problem is that you didn't copy the defines:
#define MAX_PERIODS_TO_STORE 20 // is this an Int?
#define AVERAGE_SIZE 20 // is this a Float?
#define INVALID_PULSE_PERIOD -1 // done
You have to change your defines so they work in your Swift code.
Check this answer for how to replace the Objective-C #define with something Swift-workable.
Alternatively, you could just change the defines to variables and initialize your variables with them.
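For example, the three #defines could become top-level constants, with the types chosen to match how the values are used in the Swift code above (a sketch, not from the original project):
let maxPeriodsToStore = 20         // MAX_PERIODS_TO_STORE, inferred as Int
let averageSize = 20               // AVERAGE_SIZE as an Int, so it can be compared with Int indices
let invalidPulsePeriod: Float = -1 // INVALID_PULSE_PERIOD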
First, a bit on optionals. Variables whose type ends with a '?' are Optionals, meaning they are allowed to be nil (to basically not exist). The compiler cannot know at compile time whether such a variable holds a value, because you are allowed to set it to nil.
"Unary operator '++' cannot be applied to an operand of type 'Int?'"
You seem to have read that last word as Int, but it is Int? which is significant. Basically, since it is an optional (as indicated by the question mark), the compiler knows it can be nil. You cannot use ++ on nil, and since optionals can be nil, you cannot use ++ on optionals. You must forcibly unwrap it first:
downValIndex!++ //note the exclamation point for unwrapping
"Use of unresolved identifier '=-'"
=- isn't a thing; -= is a thing. So:
downVals[downValIndex] -= newVal
downVals[downValIndex] = downVals[downValIndex] - newVal // equivalent to the above
"Binary operator '>=' cannot be applied to an operand of type 'Int? and Float'"
The compiler thinks you have an optional Int on the left of the >= and a Float on the right. Assuming you want two Ints, you must unwrap the left and make sure the right is cast to an Int (something like this). If you want two Floats instead, cast or define them as Floats instead of Ints.
if downValIndex! >= averageSize as! Int { //casting to Int
You should just be defining averageSize as an Int, though:
var averageSize: Int = 10 // or whatever number
Also, you have lots of optionals. If any of them can be defined to something at compile time, it will make your life easier, as you won't need to unwrap them everywhere. Alternatively, you could implicitly unwrap them (only do this if you are absolutely sure they will never be nil):
var implicitlyUnwrappedOptional:Int!
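If you do keep the optionals, optional binding is the usual middle ground between force-unwrapping and implicit unwrapping; a small sketch against the names used above:
if let index = downValIndex { // the body only runs when downValIndex is non-nil
    downVals[index] = -newVal // index is a plain Int here, no unwrapping needed
}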

Slow Swift Arrays and Strings performance

Here are two pretty similar Levenshtein distance algorithms.
Swift implementation:
https://gist.github.com/bgreenlee/52d93a1d8fa1b8c1f38b
And Objective-C implementation:
https://gist.github.com/boratlibre/1593632
The Swift one is dramatically slower than the ObjC implementation.
I've spent a couple of hours trying to make it faster but... It seems like Swift Array and String manipulation is not as fast as ObjC.
On 2000 random String calculations the Swift implementation is about 100(!!!) times slower than ObjC.
Honestly speaking, I've got no idea what could be wrong, because even this part of the Swift
func levenshtein(aStr: String, bStr: String) -> Int {
    // create character arrays
    let a = Array(aStr)
    let b = Array(bStr)
    ...
is a few times slower than the whole algorithm in Objective-C.
Does anyone know how to speed up Swift calculations?
Thank you in advance!
Update
After all the suggested improvements, the Swift code looks like this.
And it is still 4 times slower than ObjC in the Release configuration.
import Foundation

class Array2D {
    var cols: Int, rows: Int
    var matrix: UnsafeMutablePointer<Int>

    init(cols: Int, rows: Int) {
        self.cols = cols
        self.rows = rows
        matrix = UnsafeMutablePointer<Int>(malloc(UInt(cols * rows) * UInt(sizeof(Int))))
        for i in 0..<cols*rows {
            matrix[i] = 0
        }
    }

    subscript(col: Int, row: Int) -> Int {
        get {
            return matrix[cols * row + col] as Int
        }
        set {
            matrix[cols * row + col] = newValue
        }
    }

    func colCount() -> Int {
        return self.cols
    }

    func rowCount() -> Int {
        return self.rows
    }
}

extension String {
    func levenshteinDistanceFromStringSwift(comparingString: NSString) -> Int {
        let aStr = self
        let bStr = comparingString
        // let a = Array(aStr.unicodeScalars)
        // let b = Array(bStr.unicodeScalars)
        let a: NSString = aStr
        let b: NSString = bStr
        var dist = Array2D(cols: a.length + 1, rows: b.length + 1)
        for i in 1...a.length {
            dist[i, 0] = i
        }
        for j in 1...b.length {
            dist[0, j] = j
        }
        for i in 1...a.length {
            for j in 1...b.length {
                if a.characterAtIndex(i-1) == b.characterAtIndex(j-1) {
                    dist[i, j] = dist[i-1, j-1] // noop
                } else {
                    dist[i, j] = min(
                        dist[i-1, j] + 1,   // deletion
                        dist[i, j-1] + 1,   // insertion
                        dist[i-1, j-1] + 1  // substitution
                    )
                }
            }
        }
        return dist[a.length, b.length]
    }

    func levenshteinDistanceFromStringObjC(comparingString: String) -> Int {
        let aStr = self
        let bStr = comparingString
        // It is really strange, but I have to call into Objective-C because of the dramatically slow Swift performance
        return aStr.compareWithWord(bStr, matchGain: 0, missingCost: 1)
    }
}
malloc?? NSString?? And at the end, a 4-times speed decrease? Does anybody need Swift anymore?
There are multiple reasons why the Swift code is slower than the Objective-C code.
I made a very simple test case by comparing two fixed strings 100 times.
Objective-C code: 0.026 seconds
Swift code: 3.14 seconds
The first reason is that a Swift Character represents an "extended grapheme cluster",
which can contain several Unicode code points (e.g. "flags"). This makes the
decomposition of a string into characters slow. On the other hand, Objective-C
NSString stores the strings as a sequence of UTF-16 code points.
If you replace
let a = Array(aStr)
let b = Array(bStr)
by
let a = Array(aStr.utf16)
let b = Array(bStr.utf16)
so that the Swift code works on UTF-16 sequences as well then the time goes down
to 1.88 seconds.
The allocation of the 2-dimensional array is also slow. It is faster to allocate
a single one-dimensional array. I found a simple Array2D class here:
http://blog.trolieb.com/trouble-multidimensional-arrays-swift/
class Array2D {
    var cols: Int, rows: Int
    var matrix: [Int]

    init(cols: Int, rows: Int) {
        self.cols = cols
        self.rows = rows
        matrix = Array(count: cols*rows, repeatedValue: 0)
    }

    subscript(col: Int, row: Int) -> Int {
        get {
            return matrix[cols * row + col]
        }
        set {
            matrix[cols * row + col] = newValue
        }
    }

    func colCount() -> Int {
        return self.cols
    }

    func rowCount() -> Int {
        return self.rows
    }
}
Using that class in your code
func levenshtein(aStr: String, bStr: String) -> Int {
    let a = Array(aStr.utf16)
    let b = Array(bStr.utf16)
    var dist = Array2D(cols: a.count + 1, rows: b.count + 1)
    for i in 1...a.count {
        dist[i, 0] = i
    }
    for j in 1...b.count {
        dist[0, j] = j
    }
    for i in 1...a.count {
        for j in 1...b.count {
            if a[i-1] == b[j-1] {
                dist[i, j] = dist[i-1, j-1] // noop
            } else {
                dist[i, j] = min(
                    dist[i-1, j] + 1,   // deletion
                    dist[i, j-1] + 1,   // insertion
                    dist[i-1, j-1] + 1  // substitution
                )
            }
        }
    }
    return dist[a.count, b.count]
}
the time in the test case goes down to 0.84 seconds.
The last bottleneck that I found in the Swift code is the min() function.
The Swift library has a built-in min() function which is faster. So just removing
the custom function from the Swift code reduces the time for the test case to
0.04 seconds, which is almost as good as the Objective-C version.
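For context, the gist defines its own three-argument min, which shadows the standard library's generic version; treat the exact signature here as an assumption, but it is along these lines:
// A custom function like this shadows Swift's built-in min for Int arguments:
func min(a: Int, b: Int, c: Int) -> Int {
    var m = a
    if b < m { m = b }
    if c < m { m = c }
    return m
}
// Deleting it lets calls like min(x, y, z) resolve to the built-in instead.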
Addendum: Using Unicode scalars seems to be even slightly faster:
let a = Array(aStr.unicodeScalars)
let b = Array(bStr.unicodeScalars)
and has the advantage that it works correctly with surrogate pairs such
as Emojis.
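To make the Character point concrete, the three views of one string can have different lengths; a small sketch (the exact counts depend on the string and the Swift version):
let s = "abc🇩🇪"
let characters = Array(s)             // extended grapheme clusters: the flag is one Character
let utf16Units = Array(s.utf16)       // UTF-16 code units: the flag is four units
let scalars = Array(s.unicodeScalars) // Unicode scalar values: the flag is two scalars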

Finding BST's height non-recursively?

This is a recursive method for finding the height, but I have a very large number of nodes in my binary search tree, and I want to find the height of the tree as well as assign its height to each individual subtree. The recursive method throws a StackOverflowException, so how do I do this non-recursively, and without using a stack?
private int FindHeight(TreeNode node)
{
    if (node == null)
    {
        return -1;
    }
    else
    {
        node.Height = 1 + Math.Max(FindHeight(node.Left), FindHeight(node.Right));
        return node.Height;
    }
}
I believe I have to use post-order traversal, but without a stack? I was able to write the method below, and it does return the correct height, but it assigns each node its depth, not its height.
public void FindHeight()
{
    int maxHeight = 0;
    Queue<TreeNode> Q = new Queue<TreeNode>();
    TreeNode node;
    Q.Enqueue(Root);
    while (Q.Count != 0)
    {
        node = Q.Dequeue();
        int nodeHeight = node.Height;
        if (node.Left != null)
        {
            node.Left.Height = nodeHeight + 1;
            Q.Enqueue(node.Left);
        }
        if (node.Right != null)
        {
            node.Right.Height = nodeHeight + 1;
            Q.Enqueue(node.Right);
        }
        if (nodeHeight > maxHeight)
        {
            maxHeight = nodeHeight;
        }
    }
    Console.WriteLine(maxHeight);
}
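Not part of the original post, but for completeness: one way to get true heights without recursion or an explicit traversal stack is to reuse the queue idea, collecting the nodes in breadth-first order and then walking that list backwards, so both children are finalized before their parent. A sketch against the same TreeNode shape (the method name is hypothetical):
public int FindHeightIteratively()
{
    var order = new List<TreeNode>();
    var queue = new Queue<TreeNode>();
    if (Root != null)
    {
        queue.Enqueue(Root);
    }
    while (queue.Count != 0)
    {
        TreeNode node = queue.Dequeue();
        order.Add(node);
        if (node.Left != null) queue.Enqueue(node.Left);
        if (node.Right != null) queue.Enqueue(node.Right);
    }
    // Reverse breadth-first order visits children before parents,
    // which is exactly what the recursive post-order version relied on.
    for (int i = order.Count - 1; i >= 0; i--)
    {
        TreeNode node = order[i];
        int left = node.Left == null ? -1 : node.Left.Height;
        int right = node.Right == null ? -1 : node.Right.Height;
        node.Height = 1 + Math.Max(left, right);
    }
    return order.Count == 0 ? -1 : order[0].Height;
}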