Is there a point to "optimizing" types on iOS devices? [closed] - objective-c

Here I was just writing some code that dealt with an integer value in the range -24 to +24, and I made my method return an int. Then I thought to myself: should I really be using a short in this case? I know it might have mattered back in the day when a machine's entire memory was 48K, but in today's modern world does it really matter?
Is it OK to just be "int happy", even when I know my numbers are going to be very small?

All ARM CPUs have 32-bit integer registers and at least a 32-bit wide L1 bus, so using a short will give absolutely no advantage [1], and may in some cases be detrimental to performance.
Leave the variable as an int and be safe in the knowledge that you'll get an optimum register width pretty much wherever you run the code.
[1] The exception to this rule being when using the NEON unit - in which case a 16-bit operation offers more parallelism than a 32-bit operation.
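For illustration, here is a minimal C sketch (the function names and test data are invented for this answer) of why a short can actually cost instructions on ARM: the 16-bit accumulator must be re-narrowed after every addition to preserve C's wrap-around semantics, while the int version maps straight onto a 32-bit register.

/* Hypothetical example: summing small values with a short vs. an int
   accumulator. On 32-bit ARM the short version typically needs an
   extra sign-extension (SXTH) per iteration; the int version does not. */
#include <stdio.h>

short sum_short(const short *v, int n) {
    short acc = 0;
    for (int i = 0; i < n; i++)
        acc = (short)(acc + v[i]);  /* re-truncated to 16 bits each time */
    return acc;
}

int sum_int(const short *v, int n) {
    int acc = 0;
    for (int i = 0; i < n; i++)
        acc += v[i];                /* lives in a full 32-bit register */
    return acc;
}

int main(void) {
    short data[] = { -24, 24, 7, -3 };
    printf("%d %d\n", sum_short(data, 4), sum_int(data, 4));  /* 4 4 */
    return 0;
}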

Related

Counting Kernels of Corn [closed]

I'm looking for some guidance here. I am primarily a frontend developer. What I am trying to figure out is how an algorithm to count the kernels on an ear of corn can be implemented.
From my initial research it seems there are a couple of different directions to take. The main ones I have seen are a SIFT-style implementation, while others call for conversion to the HSV or LAB color space, followed by normalization and then counting.
For reference, the corn that will be counted is usually "dent" corn.
This will be implemented in VB.net, but I can always translate the algorithm if needed.
Thank you for your help!
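If it helps, here is a rough C sketch of just the counting stage of the pipeline described above, assuming a prior HSV/LAB threshold has already produced a binary kernel mask (the tiny hard-coded mask below stands in for a real thresholded photo). A 4-neighbour flood fill erases each connected blob as it is counted; translating this to VB.net should be straightforward.

/* Count connected blobs (kernels) in a pre-thresholded binary mask.
   The mask is hard-coded for the demo; a real implementation would
   build it from the photograph after the color-space conversion. */
#include <stdio.h>

#define W 8
#define H 6

static int mask[H][W] = {
    {0,1,1,0,0,0,1,0},
    {0,1,1,0,0,1,1,0},
    {0,0,0,0,0,0,0,0},
    {1,0,0,1,1,0,0,0},
    {1,0,0,1,1,0,0,1},
    {0,0,0,0,0,0,0,1},
};

/* Recursive 4-neighbour flood fill: erases one connected blob. */
static void flood(int y, int x) {
    if (y < 0 || y >= H || x < 0 || x >= W || !mask[y][x]) return;
    mask[y][x] = 0;
    flood(y + 1, x); flood(y - 1, x);
    flood(y, x + 1); flood(y, x - 1);
}

int main(void) {
    int blobs = 0;
    for (int y = 0; y < H; y++)
        for (int x = 0; x < W; x++)
            if (mask[y][x]) { blobs++; flood(y, x); }
    printf("kernel count: %d\n", blobs);  /* prints 5 for this mask */
    return 0;
}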

Programming language without integer type [closed]

What would be the shortcomings of a language without an integer type (only floats)? Which of them would be very serious or unsolvable? Suppose the compiler is sufficiently smart to print a rounded number when it knows that the number is really an integer. Has such an idea been discussed in scientific papers or implemented in some language (I can find neither)?
JavaScript is such a language: all numbers are double-precision (64-bit) floating point. There are a number of unfortunate consequences of this design choice, but clearly the result can be made to work...
Note that double-precision floating point can precisely represent integers of up to 53 bits, so it can be more practical than you might think to use it as an integer substitute.
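A quick C demonstration of that 53-bit limit (the program itself is just an illustration): every integer up to 2^53 has an exact double representation, but 2^53 + 1 does not and rounds back down.

#include <stdio.h>
#include <math.h>

int main(void) {
    double big = pow(2, 53);     /* 9007199254740992, exactly representable */
    printf("%.0f\n", big);       /* 9007199254740992 */
    printf("%.0f\n", big + 1);   /* still 9007199254740992: 2^53 + 1 rounds away */
    printf("%.0f\n", big + 2);   /* 9007199254740994: exact again */
    return 0;
}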
I believe the problem in question comes down to two points:
Float data demands more of the processor than integer data, so depending on the size of the program this can become a real overhead.
The other point is main-memory usage: following the same logic as with the processor, float data requires more space than integer data.
I hope this has contributed to answering your question.
There's a relatively new proposal for a decimal floating-point number format, http://dec64.com/. It would be really interesting to see a language supporting it natively.
As for the shortcomings of only using floats, nothing that would be a real problem comes to mind. Some operations would be strange, as is the case in JavaScript with things like bit twiddling.
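As a sketch of what that strangeness looks like: to do bitwise work, a floats-only language such as JavaScript converts the double to a 32-bit integer, operates, and converts back (JS calls this ToInt32). The helper below is hypothetical and only loosely models that behaviour in C.

#include <stdio.h>
#include <stdint.h>

/* Bitwise OR on "floats": truncate each operand toward zero to a
   32-bit integer, OR them, and return the result as a double again. */
static double bit_or(double a, double b) {
    int32_t ia = (int32_t)(int64_t)a;
    int32_t ib = (int32_t)(int64_t)b;
    return (double)(ia | ib);
}

int main(void) {
    printf("%.0f\n", bit_or(12.0, 10.0));  /* 14 */
    printf("%.0f\n", bit_or(12.9, 10.0));  /* also 14: the fraction is lost */
    return 0;
}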

Getting Mac system temperatures with Objective-C [closed]

How do I access the system temperatures and fan speeds of a Mac using Objective-C? I have seen it done in applications like iStat, but I cannot figure out how to do this. Does anyone know how?
You may also take a look at https://github.com/fmorrow/SMCWrapper - it's an object-oriented version of smc.c/smc.h. (The SMCWrapper code is well commented.)
Currently, it opens a connection to the AppleSMC IOService using IOKit and uses it to make calls to the SMC chip. You can read keys into an NSString, but setting a key's value is experimental at this time.
Take a look at https://github.com/lavoiesl/osx-cpu-temp/blob/master/smc.c for sample code that reads the temperatures from the SMC.
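To make the IOKit step concrete, here is a bare-bones C sketch of just the connection to the AppleSMC service, assuming the actual key-read protocol (the IOConnectCallStructMethod calls and the SMC key structs) is taken from the smc.c linked above. Compile with -framework IOKit.

#include <stdio.h>
#include <mach/mach.h>
#include <IOKit/IOKitLib.h>

int main(void) {
    /* Find and open the AppleSMC service; everything else goes
       through this io_connect_t handle. */
    io_service_t service = IOServiceGetMatchingService(
        kIOMasterPortDefault, IOServiceMatching("AppleSMC"));
    if (!service) {
        fprintf(stderr, "AppleSMC service not found\n");
        return 1;
    }

    io_connect_t conn;
    kern_return_t kr = IOServiceOpen(service, mach_task_self(), 0, &conn);
    IOObjectRelease(service);
    if (kr != kIOReturnSuccess) {
        fprintf(stderr, "IOServiceOpen failed: 0x%x\n", kr);
        return 1;
    }

    /* ... read keys such as "TC0P" (CPU proximity temperature) here,
       using the struct-method calls implemented in smc.c ... */

    IOServiceClose(conn);
    return 0;
}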

Is using many variables good practice? [closed]

I always tend to declare a new variable every time I have some heavy computation (e.g. a mathematical operation) whose result I need to store.
Is this good or bad practice?
Well, one of my first programs (solving a quadratic equation) had 30 variables, and I felt so proud that I managed it lol. However, too many unnecessary variables will kill readability big time, and you're going to start having trouble.
I'd say it's a double-edged sword: use too many and you fail; use too few and errors start to happen.
Aim for making the code neat, so you have minimal trouble locating yourself and figuring out what's happening.
Good luck! :)
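As a small illustration of that middle ground, here is the quadratic solver with just a couple of well-named intermediates (disc and denom, both invented for this example) instead of thirty variables or one unreadable expression.

#include <stdio.h>
#include <math.h>

int main(void) {
    double a = 1.0, b = -3.0, c = 2.0;  /* x^2 - 3x + 2 = 0 */
    double disc = b * b - 4.0 * a * c;  /* discriminant, reused twice below */
    double denom = 2.0 * a;

    if (disc < 0) {
        printf("no real roots\n");
    } else {
        printf("x1 = %g, x2 = %g\n",
               (-b + sqrt(disc)) / denom,
               (-b - sqrt(disc)) / denom);  /* prints x1 = 2, x2 = 1 */
    }
    return 0;
}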

Are CRC cards still used for designing your system? [closed]

CRC cards are known as one of the simple, intuitive methods for modelling your system before building it. Many people praise their virtues, with only a little criticism, but I could not find solid examples of their actual use, or good case studies.
YouTube offers only two direct examples of how the CRC method is used, and neither of them is American, even though the two creators of the method are great Americans, which is amusing.
So here I want to know: how many people actually use CRC cards in design sessions? Is the method still valid or useful? Is it worth investigating, practicing, and putting many hours into?
My guess is that this has largely been replaced by UML. I've never heard of anyone using CRC cards, but then again I'm more in web development than corporate development.