In a previous project I had this function defined in AppDelegate.m and it was globally available to all parts of the app:
#define UIColorFromRGB(rgbValue) [UIColor \
colorWithRed:((float)((rgbValue & 0xFF0000) >> 16))/255.0 \
green:((float)((rgbValue & 0xFF00) >> 8))/255.0 \
blue:((float)(rgbValue & 0xFF))/255.0 alpha:1.0]
Now, for some reason, when I place this code in a new project, only the specific class can see the function, but its child classes cannot.
How do I make this function globally available?
You can define your macro in the .pch file of your project, but it's generally a bad idea, since it produces code which is difficult to read, maintain, and debug.
In your specific case I'd rather create a category on UIColor.
Here's an implementation with some extra stuff which may come in handy.
UIColor+Extra.h
#import <UIKit/UIKit.h>
@interface UIColor (Extra)
+ (instancetype)extra_colorWith255BasedRed:(NSUInteger)red green:(NSUInteger)green blue:(NSUInteger)blue;
+ (instancetype)extra_colorWith255BasedRed:(NSUInteger)red green:(NSUInteger)green blue:(NSUInteger)blue alpha:(CGFloat)alpha;
+ (instancetype)extra_colorWithHex:(NSInteger)hex;
@end
UIColor+Extra.m
#import "UIColor+Extra.h"
@implementation UIColor (Extra)
+ (instancetype)extra_colorWith255BasedRed:(NSUInteger)red green:(NSUInteger)green blue:(NSUInteger)blue {
return [self extra_colorWith255BasedRed:red green:green blue:blue alpha:1.0];
}
+ (instancetype)extra_colorWith255BasedRed:(NSUInteger)red green:(NSUInteger)green blue:(NSUInteger)blue alpha:(CGFloat)alpha {
return [self colorWithRed:red/255.0f green:green/255.0f blue:blue/255.0f alpha:alpha];
}
+ (instancetype)extra_colorWithHex:(NSInteger)hex {
return [self extra_colorWith255BasedRed:((hex & 0xFF0000) >> 16) green:((hex & 0xFF00) >> 8) blue:(hex & 0xFF)];
}
@end
Then just place
#import "UIColor+Extra.h"
in your .pch file and use it elsewhere.
Examples
UIColor * cyan = [UIColor extra_colorWithHex:0x00FFFF];
UIColor * magenta = [UIColor extra_colorWith255BasedRed:255 green:0 blue:255];
If you add the import to YOURPROJECTNAME-Prefix.pch, it becomes part of the precompiled prefix header and is available in all your code files.
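For reference, a minimal sketch of what that prefix header might contain (the exact file name is whatever Xcode generated for your project):
#ifdef __OBJC__
    #import <UIKit/UIKit.h>
    #import "UIColor+Extra.h" // category methods now visible in every .m file
#endif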
Audun Kjelstrup has the best answer.
You also have the option to set up a class method (what other languages call a static method) inside one of your own classes:
+ (UIColor *)UIColorFromRGB {}
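A fleshed-out sketch of that idea (the parameter and body are illustrative, not spelled out in the original suggestion):
+ (UIColor *)colorFromRGB:(NSUInteger)rgbValue {
    // Shift and mask each 8-bit channel out of the 0xRRGGBB value
    return [UIColor colorWithRed:((CGFloat)((rgbValue & 0xFF0000) >> 16))/255.0
                           green:((CGFloat)((rgbValue & 0x00FF00) >> 8))/255.0
                            blue:((CGFloat)(rgbValue & 0x0000FF))/255.0
                           alpha:1.0];
}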
I agree with Gabriele about macros (they're undesirable as function substitutes) and that what you are trying to do fits nicely in a category. However, I also want to point out that Objective-C supports plain C-style functions (as opposed to the macro you have shown).
For example:
// Header
UIColor * UIColorFromRGB(NSInteger rgbValue);
// Implementation
UIColor * UIColorFromRGB(NSInteger rgbValue)
{
return [UIColor colorWithRed: ((float)((rgbValue & 0xFF0000) >> 16))/255.0
green: ((float)((rgbValue & 0xFF00) >> 8))/255.0
blue: ((float)(rgbValue & 0xFF))/255.0 alpha:1.0];
}
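Usage then looks just like the macro version:
UIColor *brown = UIColorFromRGB(0x663333);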
Related
I am making an iOS drawing app, and I want to store specific colors in a txt file and then read that file into an array. So far I have managed to read a file of hex color values into an array, but I do not know how to use the hex values. What I want to be able to do is choose a color by referencing a certain index in the array.
For example
CGContextSetStrokeColorWithColor(context, colors[2]);
I have not found anything that would work for my app.
It's really simple; do something like this:
+ (UIColor *)colorWithHexString:(NSString *)hexColorString alpha:(CGFloat)alpha {
unsigned colorValue = 0;
NSScanner *valueScanner = [NSScanner scannerWithString:hexColorString];
if ([hexColorString rangeOfString:@"#"].location != NSNotFound) [valueScanner setScanLocation:1];
[valueScanner scanHexInt:&colorValue];
return [UIColor colorWithRed:((colorValue & 0xFF0000) >> 16)/255.0 green:((colorValue & 0xFF00) >> 8)/255.0 blue:((colorValue & 0xFF) >> 0)/255.0 alpha:alpha];
}
Explanation: let's take a hex color as an example, #FFEBCD, where the # is optional. Red is simply FF, green is EB, and blue is CD, so use the right-shift operator to move each component into position, mask it, and pass the results to a UIColor factory method.
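A worked example of the shifting and masking for #FFEBCD:
// (0xFFEBCD & 0xFF0000) >> 16 = 0xFF = 255  ->  red   = 255/255.0 = 1.000
// (0xFFEBCD & 0x00FF00) >> 8  = 0xEB = 235  ->  green = 235/255.0 ≈ 0.922
//  0xFFEBCD & 0x0000FF        = 0xCD = 205  ->  blue  = 205/255.0 ≈ 0.804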
There is a nice UIColor extension on GitHub you can try too; it might be really useful for your app.
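To get back to the original goal, here is a sketch of building the color array and using an entry (assuming the method above lives in a UIColor category; note that CGContextSetStrokeColorWithColor expects a CGColorRef, not a UIColor):
// Hypothetical: hexStrings would come from your txt file
NSArray *hexStrings = @[@"#FF0000", @"#30ae36", @"#0000FF"];
NSMutableArray *colors = [NSMutableArray array];
for (NSString *hex in hexStrings) {
    [colors addObject:[UIColor colorWithHexString:hex alpha:1.0]];
}
CGContextSetStrokeColorWithColor(context, [colors[2] CGColor]);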
I have a database with a few rows where one field is a decimal ARGB value for the related color.
I have to read all the rows of this table and convert each decimal ARGB value to a UIColor.
I've googled for this, but I didn't find anything.
Is there any way to approach this?
Thanks.
Here's the method I came up with to convert an ARGB integer into a UIColor. I tested it on several colors we have in our .NET system.
+(UIColor *)colorFromARGB:(int)argb {
int blue = argb & 0xff;
int green = argb >> 8 & 0xff;
int red = argb >> 16 & 0xff;
int alpha = argb >> 24 & 0xff;
return [UIColor colorWithRed:red/255.f green:green/255.f blue:blue/255.f alpha:alpha/255.f];
}
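For example (the class name is hypothetical, whichever class you put the method in):
UIColor *semiRed = [MyColorUtils colorFromARGB:0x7FFF0000]; // alpha 0x7F ≈ 50%, red 0xFF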
text.color = [UIColor colorWithRed:10.0/255.0 green:100.0/255.0 blue:55.0/255.0 alpha:1];
You just need to divide each RGB component by 255 to set things up correctly.
You can define a macro and use it throughout your code:
#define UIColorFromARGB(rgbValue) [UIColor colorWithRed:((float)((rgbValue & 0xFF0000) >> 16))/255.0 green:((float)((rgbValue & 0xFF00) >> 8))/255.0 \
blue:((float)(rgbValue & 0xFF))/255.0 alpha:((float)((rgbValue & 0xFF000000) >> 24))/255.0]
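For example:
self.view.backgroundColor = UIColorFromARGB(0xFF663333); // fully opaque (alpha 0xFF) brown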
It's 2021, we're all using Swift ;) So here's a Swift extension to solve this:
extension UIColor {
/* Converts an AARRGGBB into a UIColor like Android Color.parseColor */
convenience init?(hexaARGB: String) {
var chars = Array(hexaARGB.hasPrefix("#") ? hexaARGB.dropFirst() : hexaARGB[...])
switch chars.count {
case 3: chars = chars.flatMap { [$0, $0] }; fallthrough
case 6: chars.insert(contentsOf: ["F", "F"], at: 0) // prepend opaque alpha, since AA comes first in AARRGGBB
case 8: break
default: return nil
}
self.init(red: .init(strtoul(String(chars[2...3]), nil, 16)) / 255,
green: .init(strtoul(String(chars[4...5]), nil, 16)) / 255,
blue: .init(strtoul(String(chars[6...7]), nil, 16)) / 255,
alpha: .init(strtoul(String(chars[0...1]), nil, 16)) / 255)
}
}
Usage
UIColor(hexaARGB: "#CCFCD204")
I wrote up some small macros to help in setting application-wide font styles, colors, and gradients. The issue is that Xcode throws warnings whenever I use a color or font defined via the macro. Is there any way to get Xcode to validate the macro, and hopefully get code hinting as well?
#define HEX_COLOR(colorname, hexcolor) \
\
@implementation UIColor(Style##colorname) \
\
+ (UIColor*) colorname \
{ \
static UIColor * staticColor = nil; \
if(! staticColor) { \
float red = ((hexcolor & 0xFF0000) >> 16)/255.0; \
float green = ((hexcolor & 0xFF00) >> 8)/255.0; \
float blue = (hexcolor & 0xFF)/255.0; \
float alpha = (hexcolor >> 24)/255.0; \
\
staticColor = [[UIColor colorWithRed:red \
green:green \
blue:blue \
alpha:alpha] retain]; \
} \
return staticColor; \
} \
@end
In my app delegate I set the application-wide fonts and colors like this:
HEX_COLOR(specialGreenColor, 0x66BAD455);
.....
This line throws the warning that the property might not exist.
[[UIColor specialGreenColor] set];
Also, I do not want to reduce the error reporting in Xcode, as not seeing warnings is a step backwards. I just would like to find a way to register the macro with Xcode.
If you want to define constants, you should use #define like so:
#define COLOUR_NAME 0x66BAD455
Then, the preprocessor will go through your file and replace all instances of COLOUR_NAME verbatim with 0x66BAD455.
There are arguably better ways to define application-wide constants though.
Edit: there's also this nice post which seems to provide a better implementation of what you're going for. You can define the macro and then define your colour constants using the question linked above.
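One way to silence the "may not respond" warning (a sketch, not taken from the linked post) is to generate a matching declaration in a header the compiler can see, since HEX_COLOR only emits the @implementation:
#define HEX_COLOR_DECL(colorname) \
@interface UIColor (Style##colorname) \
+ (UIColor *)colorname; \
@end
Putting HEX_COLOR_DECL(specialGreenColor) in an imported header should also restore code completion for the method.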
Code hinting for the macro works in Xcode 4.2.
Like the title says, I want to use a color code instead of doing something like this:
lblTemp.textColor = [UIColor colorWithRed: 0 green:0x99/255.0 blue:0 alpha:1.0];
For instance, I have the color code #30ae36; how can I use it instead of doing the above?
I use this handy macro:
#define UIColorFromRGB(rgbValue) [UIColor \
colorWithRed:((float)((rgbValue & 0xFF0000) >> 16))/255.0 \
green:((float)((rgbValue & 0xFF00) >> 8))/255.0 \
blue:((float)(rgbValue & 0xFF))/255.0 alpha:1.0]
(I might have gotten it from here, but you can find it in many places. In any case it's worth reading the discussion about macros vs inline functions at this link.)
If you want to initialize the color from a string (for example from a plist), you can use this:
unsigned rgbValues;
[[NSScanner scannerWithString:@"0xFF0000"] scanHexInt:&rgbValues];
UIColor* redColor = UIColorFromRGB(rgbValues);
UIColor doesn't offer a default method for that, but you could create a so-called category on UIColor which takes the hex value (sans the #), turns it into the corresponding RGB components, and feeds those to UIColor's colorWithRed:green:blue:alpha:.
This should do the trick for the code #30ae36:
lblTemp4.textColor = [UIColor colorWithRed:0x*30*/255.0 green:0x*AE*/255.0 blue:0x*36*/255.0 alpha:1.0];
In the previous answer from jovany, the * stars should be omitted. So #30ae36 becomes:
lblTemp4.textColor = [UIColor colorWithRed:0x30/255.0 green:0xAE/255.0 blue:0x36/255.0 alpha:1.0];
(Maybe obvious, but I didn't get it at first sight.)
I have this macro in my header file:
#define UIColorFromRGB(rgbValue) \
[UIColor colorWithRed:((float)((rgbValue & 0xFF0000) >> 16))/255.0 \
green:((float)((rgbValue & 0xFF00) >> 8))/255.0 \
blue:((float)(rgbValue & 0xFF))/255.0 \
alpha:1.0]
And I am using it like this in my .m file:
cell.textColor = UIColorFromRGB(0x663333);
So I want to ask everyone: is this better, or should I use this approach:
cell.textColor = [UIColor colorWithRed:66/255.0
green:33/255.0
blue:33/255.0
alpha:1.0];
Which one is the better approach?
Or create a separate category, so you only need to import one .h file:
@interface UIColor (util)
+ (UIColor *) colorWithHexString:(NSString *)hex;
+ (UIColor *) colorWithHexValue: (NSInteger) hex;
@end
and
#import "UIColor-util.h"
@implementation UIColor (util)
// Create a color using a string with a webcolor
// ex. [UIColor colorWithHexString:@"#03047F"]
+ (UIColor *) colorWithHexString:(NSString *)hexstr {
NSScanner *scanner;
unsigned int rgbval;
scanner = [NSScanner scannerWithString: hexstr];
[scanner setCharactersToBeSkipped:[NSCharacterSet characterSetWithCharactersInString:@"#"]];
[scanner scanHexInt: &rgbval];
return [UIColor colorWithHexValue: rgbval];
}
// Create a color using a hex RGB value
// ex. [UIColor colorWithHexValue: 0x03047F]
+ (UIColor *) colorWithHexValue: (NSInteger) rgbValue {
return [UIColor colorWithRed:((float)((rgbValue & 0xFF0000) >> 16))/255.0
green:((float)((rgbValue & 0xFF00) >> 8))/255.0
blue:((float)(rgbValue & 0xFF))/255.0
alpha:1.0];
}
@end
How about creating your own:
#define RGB(r, g, b) \
[UIColor colorWithRed:(r)/255.0 green:(g)/255.0 blue:(b)/255.0 alpha:1]
#define RGBA(r, g, b, a) \
[UIColor colorWithRed:(r)/255.0 green:(g)/255.0 blue:(b)/255.0 alpha:(a)]
Then use it:
cell.textColor = RGB(0x66, 0x33, 0x33);
Seems simple enough to use; it takes hex values for the colors without needing additional calculation overhead.
A middle ground might be your best option. You could define either a regular C function or an Objective-C method to do what your macro is doing now:
// As a C function:
UIColor* UIColorFromRGB(NSInteger rgbValue) {
return [UIColor colorWithRed:((float)((rgbValue & 0xFF0000) >> 16))/255.0
green:((float)((rgbValue & 0xFF00) >> 8))/255.0
blue:((float)(rgbValue & 0xFF))/255.0
alpha:1.0];
}
// As an Objective-C method:
- (UIColor *)UIColorFromRGB:(NSInteger)rgbValue {
return [UIColor colorWithRed:((float)((rgbValue & 0xFF0000) >> 16))/255.0
green:((float)((rgbValue & 0xFF00) >> 8))/255.0
blue:((float)(rgbValue & 0xFF))/255.0
alpha:1.0];
}
If you decide to stick with the macro, though, you should put parentheses around rgbValue wherever it appears. If I decide to call your macro with:
UIColorFromRGB(0xFF0000 + 0x00CC00 + 0x000099);
you may run into trouble.
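A version with the argument parenthesized might look like this:
#define UIColorFromRGB(rgbValue) [UIColor \
colorWithRed:((float)(((rgbValue) & 0xFF0000) >> 16))/255.0 \
green:((float)(((rgbValue) & 0xFF00) >> 8))/255.0 \
blue:((float)((rgbValue) & 0xFF))/255.0 alpha:1.0]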
Of the snippets above, the Objective-C method is certainly the most readable, but probably the least portable - you can't call it simply from anywhere in your program.
All in all, I'd suggest refactoring your macro into a function and leaving it at that.
I typically recommend functions rather than complex #defines. If inlining has a real benefit, the compiler will generally do it for you. #defines make debugging difficult, particularly when they're complex (and this one is).
But there's nothing wrong with using a function here. The only nitpick I'd say is that you should be using CGFloat rather than float, but there's nothing wrong with the hex notation if it's more comfortable for you. If you have a lot of these, I can see where using Web color notation may be convenient. But avoid macros.
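A CGFloat-based sketch of such a function (the name is illustrative):
static inline UIColor *ColorFromRGBHex(NSUInteger rgb) {
    return [UIColor colorWithRed:(CGFloat)((rgb & 0xFF0000) >> 16) / 255.0
                           green:(CGFloat)((rgb & 0x00FF00) >> 8) / 255.0
                            blue:(CGFloat)(rgb & 0x0000FF) / 255.0
                           alpha:1.0];
}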
IMHO the UIColor method is more readable. I think macros are great if they solve a problem, i.e. provide more performance and/or more readable code.
It is not clear to me what the advantage of using a macro is in this case, so I'd prefer the second option.
Keep in mind that 33 != 0x33. The first is decimal notation and the second is hexadecimal. They're both valid, but they are different. Your second option should read
cell.textColor = [UIColor colorWithRed:0x66/255.0
green:0x33/255.0
blue:0x33/255.0
alpha:1.0];
or
cell.textColor = [UIColor colorWithRed:102/255.0
green:51/255.0
blue:51/255.0
alpha:1.0];
Nice, Marius, but to compile I had to get rid of the parentheses, as follows (otherwise Objective-C takes them literally and you get a syntax error):
#define RGB(r,g,b) [UIColor colorWithRed:r/255.0 green:g/255.0 blue:b/255.0 alpha:1.0]
...
NSArray *palette;
...
palette = [NSArray arrayWithObjects:
RGB(0,0,0),
RGB(255,0,0), // red
...