Get Slightly Lighter and Darker Color from UIColor - objective-c

I was looking to be able to turn any UIColor into a gradient. The way I intend to do this is by using Core Graphics to draw a gradient. What I am trying to do is take a color, let's say:
[UIColor colorWithRed:0.5 green:0.5 blue:0.5 alpha:1.0];
and get a UIColor which is a few shades darker and a few shades lighter. Does anyone know how to do this? Thank you.

- (UIColor *)lighterColorForColor:(UIColor *)c
{
CGFloat r, g, b, a;
if ([c getRed:&r green:&g blue:&b alpha:&a])
return [UIColor colorWithRed:MIN(r + 0.2, 1.0)
green:MIN(g + 0.2, 1.0)
blue:MIN(b + 0.2, 1.0)
alpha:a];
return nil;
}
- (UIColor *)darkerColorForColor:(UIColor *)c
{
CGFloat r, g, b, a;
if ([c getRed:&r green:&g blue:&b alpha:&a])
return [UIColor colorWithRed:MAX(r - 0.2, 0.0)
green:MAX(g - 0.2, 0.0)
blue:MAX(b - 0.2, 0.0)
alpha:a];
return nil;
}
Use it like this:
UIColor *baseColor = // however you obtain your color
UIColor *lighterColor = [self lighterColorForColor:baseColor];
UIColor *darkerColor = [self darkerColorForColor:baseColor];
EDIT: as @Anchu Chimala pointed out, for maximum flexibility, these methods should be implemented as a UIColor category. Also, following @Riley's idea, it may be better to make the color proportionally darker or lighter instead of adding or subtracting constant values. As @jrturton pointed out, it's not necessary to manipulate the RGB components; it's better to modify the brightness property itself. All in all:
@implementation UIColor (LightAndDark)
- (UIColor *)lighterColor
{
CGFloat h, s, b, a;
if ([self getHue:&h saturation:&s brightness:&b alpha:&a])
return [UIColor colorWithHue:h
saturation:s
brightness:MIN(b * 1.3, 1.0)
alpha:a];
return nil;
}
- (UIColor *)darkerColor
{
CGFloat h, s, b, a;
if ([self getHue:&h saturation:&s brightness:&b alpha:&a])
return [UIColor colorWithHue:h
saturation:s
brightness:b * 0.75
alpha:a];
return nil;
}
@end

TL;DR:
Swift:
extension UIColor {
var lighterColor: UIColor {
return lighterColor(removeSaturation: 0.5, resultAlpha: -1)
}
func lighterColor(removeSaturation val: CGFloat, resultAlpha alpha: CGFloat) -> UIColor {
var h: CGFloat = 0, s: CGFloat = 0
var b: CGFloat = 0, a: CGFloat = 0
guard getHue(&h, saturation: &s, brightness: &b, alpha: &a)
else {return self}
return UIColor(hue: h,
saturation: max(s - val, 0.0),
brightness: b,
alpha: alpha == -1 ? a : alpha)
}
}
Usage:
let lightColor = somethingDark.lighterColor
Objective-C:
- (UIColor *)lighterColorRemoveSaturation:(CGFloat)removeS
resultAlpha:(CGFloat)alpha {
CGFloat h,s,b,a;
if ([self getHue:&h saturation:&s brightness:&b alpha:&a]) {
return [UIColor colorWithHue:h
saturation:MAX(s - removeS, 0.0)
brightness:b
alpha:alpha == -1? a:alpha];
}
return nil;
}
- (UIColor *)lighterColor {
return [self lighterColorRemoveSaturation:0.5
resultAlpha:-1];
}
@rchampourlier was right in his comment to @user529758 (the accepted answer): the HSB (or HSV) and RGB solutions give completely different results. RGB just adds white (or moves the color closer to it), while the HSB solution moves the color along the Brightness scale, which basically starts at black and ends at the pure color...
Basically, Brightness (Value) makes the color closer to or further from black, whereas Saturation makes it closer to or further from white...
So the solution for making a color actually brighter (i.e. closer to white...) is to make its Saturation value smaller, resulting in this solution:
- (UIColor *)lighterColor {
CGFloat h,s,b,a;
if ([self getHue:&h saturation:&s brightness:&b alpha:&a]) {
return [UIColor colorWithHue:h
saturation:MAX(s - 0.3, 0.0)
brightness:b /*MIN(b * 1.3, 1.0)*/
alpha:a];
}
return nil;
}

Swift universal extension for iOS and OS X, using getHue:
#if os(OSX)
import Cocoa
public typealias PXColor = NSColor
#else
import UIKit
public typealias PXColor = UIColor
#endif
extension PXColor {
func lighter(amount : CGFloat = 0.25) -> PXColor {
return hueColorWithBrightnessAmount(1 + amount)
}
func darker(amount : CGFloat = 0.25) -> PXColor {
return hueColorWithBrightnessAmount(1 - amount)
}
private func hueColorWithBrightnessAmount(amount: CGFloat) -> PXColor {
var hue : CGFloat = 0
var saturation : CGFloat = 0
var brightness : CGFloat = 0
var alpha : CGFloat = 0
#if os(iOS)
if getHue(&hue, saturation: &saturation, brightness: &brightness, alpha: &alpha) {
return PXColor( hue: hue,
saturation: saturation,
brightness: brightness * amount,
alpha: alpha )
} else {
return self
}
#else
getHue(&hue, saturation: &saturation, brightness: &brightness, alpha: &alpha)
return PXColor( hue: hue,
saturation: saturation,
brightness: brightness * amount,
alpha: alpha )
#endif
}
}
Usage:
let color = UIColor(red: 0.5, green: 0.8, blue: 0.8, alpha: 1.0)
color.lighter(amount:0.5)
color.darker(amount:0.5)
OR (with the default values):
color.lighter()
color.darker()
Sample:

I just wanted to get the same result, in RGB, as
placing the color with alpha x% over a white background to lighten it
placing the color with alpha x% over a black background to darken it
which gives the same result, AFAIK, as picking the color in a 'color to white' or 'color to black' gradient, at x% of the gradient's length.
For that purpose, the math is simple:
extension UIColor {
func mix(with color: UIColor, amount: CGFloat) -> UIColor {
var red1: CGFloat = 0
var green1: CGFloat = 0
var blue1: CGFloat = 0
var alpha1: CGFloat = 0
var red2: CGFloat = 0
var green2: CGFloat = 0
var blue2: CGFloat = 0
var alpha2: CGFloat = 0
getRed(&red1, green: &green1, blue: &blue1, alpha: &alpha1)
color.getRed(&red2, green: &green2, blue: &blue2, alpha: &alpha2)
return UIColor(
red: red1 * (1.0 - amount) + red2 * amount,
green: green1 * (1.0 - amount) + green2 * amount,
blue: blue1 * (1.0 - amount) + blue2 * amount,
alpha: alpha1
)
}
}
Here are examples with some colors

user529758's solution in Swift:
Darker color:
func darkerColorForColor(color: UIColor) -> UIColor {
var r:CGFloat = 0, g:CGFloat = 0, b:CGFloat = 0, a:CGFloat = 0
if color.getRed(&r, green: &g, blue: &b, alpha: &a){
return UIColor(red: max(r - 0.2, 0.0), green: max(g - 0.2, 0.0), blue: max(b - 0.2, 0.0), alpha: a)
}
return UIColor()
}
Lighter color:
func lighterColorForColor(color: UIColor) -> UIColor {
var r:CGFloat = 0, g:CGFloat = 0, b:CGFloat = 0, a:CGFloat = 0
if color.getRed(&r, green: &g, blue: &b, alpha: &a){
return UIColor(red: min(r + 0.2, 1.0), green: min(g + 0.2, 1.0), blue: min(b + 0.2, 1.0), alpha: a)
}
return UIColor()
}

If you convert the RGB color to the HSL color model, you can vary the L (lightness) component from L = 0.0 (black) through L = 0.5 (the natural color) to L = 1.0 (white). UIColor cannot handle HSL directly, but there are formulas for converting RGB <-> HSL.
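To make that concrete, here is a minimal Swift sketch (not part of the original answer): the method name withLightnessAdjusted(by:) is made up, and the conversion uses the standard RGB <-> HSL formulas with hue, saturation, and lightness all in the 0...1 range.
import UIKit

extension UIColor {
    // Sketch only: adjust the HSL lightness of an RGB-compatible UIColor.
    func withLightnessAdjusted(by delta: CGFloat) -> UIColor {
        var r: CGFloat = 0, g: CGFloat = 0, b: CGFloat = 0, a: CGFloat = 0
        guard getRed(&r, green: &g, blue: &b, alpha: &a) else { return self }

        // RGB -> HSL
        let maxC = max(r, g, b), minC = min(r, g, b)
        var h: CGFloat = 0, s: CGFloat = 0
        var l = (maxC + minC) / 2
        if maxC != minC {
            let d = maxC - minC
            s = l > 0.5 ? d / (2 - maxC - minC) : d / (maxC + minC)
            switch maxC {
            case r: h = (g - b) / d + (g < b ? 6 : 0)
            case g: h = (b - r) / d + 2
            default: h = (r - g) / d + 4
            }
            h /= 6
        }

        // Vary L: 0.0 is black, 1.0 is white, the original value is the natural color.
        l = min(max(l + delta, 0), 1)

        // HSL -> RGB
        func hueToRGB(_ p: CGFloat, _ q: CGFloat, _ t: CGFloat) -> CGFloat {
            var t = t
            if t < 0 { t += 1 }
            if t > 1 { t -= 1 }
            if t < 1 / 6 { return p + (q - p) * 6 * t }
            if t < 1 / 2 { return q }
            if t < 2 / 3 { return p + (q - p) * (2 / 3 - t) * 6 }
            return p
        }
        if s == 0 { return UIColor(white: l, alpha: a) }
        let q = l < 0.5 ? l * (1 + s) : l + s - l * s
        let p = 2 * l - q
        return UIColor(red: hueToRGB(p, q, h + 1 / 3),
                       green: hueToRGB(p, q, h),
                       blue: hueToRGB(p, q, h - 1 / 3),
                       alpha: a)
    }
}

// let lighter = color.withLightnessAdjusted(by: 0.2)
// let darker = color.withLightnessAdjusted(by: -0.2)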

All the other answers in this thread use either the RGB color system or simply change the hue or brightness value of the HSB system. As explained in detail in this great blog post, the correct way to make a color lighter or darker is to change its luminance value. None of the other answers do that. If you want to do it right, use my solution or write your own after reading the blog post.
Unfortunately, it's quite a hassle to change any of the attributes of a UIColor out of the box. Apple also doesn't support any LAB-based color space like HCL in the UIColor class (the L in LAB is the luminance value we are looking for).
Using HandyUIKit (install it via Carthage) adds support for HCL and makes your life a lot easier:
import HandyUIKit
let color = UIColor(red: 0.5, green: 0.5, blue: 0.5, alpha: 1.0)
// create a new UIColor object with a specific luminance (slightly lighter)
color.change(.luminance, to: 0.7)
There is also an option to apply a relative change (recommended):
// create a new UIColor object with slightly darker color
color.change(.luminance, by: -0.2)
Note that HandyUIKit also adds some other handy UI features to your project – check out its README on GitHub for more details.
I hope it helps!
Disclaimer: I'm the author of HandyUIKit.

None of the solutions posted quite worked for all colours and shades, but then I stumbled across this library which provides a set of very well implemented extensions to UIColor.
Specifically it has a lighten function as part of its HSL implementation: (UIColor *)lighten:(CGFloat)amount - which works perfectly.

Sebyddd solution as an extension:
extension UIColor {
func darker() -> UIColor {
var r:CGFloat = 0, g:CGFloat = 0, b:CGFloat = 0, a:CGFloat = 0
if self.getRed(&r, green: &g, blue: &b, alpha: &a){
return UIColor(red: max(r - 0.2, 0.0), green: max(g - 0.2, 0.0), blue: max(b - 0.2, 0.0), alpha: a)
}
return UIColor()
}
func lighter() -> UIColor {
var r:CGFloat = 0, g:CGFloat = 0, b:CGFloat = 0, a:CGFloat = 0
if self.getRed(&r, green: &g, blue: &b, alpha: &a){
return UIColor(red: min(r + 0.2, 1.0), green: min(g + 0.2, 1.0), blue: min(b + 0.2, 1.0), alpha: a)
}
return UIColor()
}
}
Usage:
let darkerYellow = UIColor.yellow.darker()
let lighterYellow = UIColor.yellow.lighter()

Swift 5
extension UIColor {
func lighter(by percentage:CGFloat=30.0) -> UIColor? {
return self.adjust(by: abs(percentage) )
}
func darker(by percentage:CGFloat=30.0) -> UIColor? {
return self.adjust(by: -1 * abs(percentage) )
}
func adjust(by percentage:CGFloat=30.0) -> UIColor? {
var r:CGFloat=0, g:CGFloat=0, b:CGFloat=0, a:CGFloat=0;
if self.getRed(&r, green: &g, blue: &b, alpha: &a) {
return UIColor(red: min(max(r + percentage/100, 0.0), 1.0),
green: min(max(g + percentage/100, 0.0), 1.0),
blue: min(max(b + percentage/100, 0.0), 1.0),
alpha: a)
} else {
return nil
}
}
}

If you want user529758's solution to work with gray shades (like [UIColor lightGrayColor] or [UIColor darkGrayColor]), you have to improve it like this:
- (UIColor *)lighterColor
{
CGFloat h, s, b, a;
if ([self getHue:&h saturation:&s brightness:&b alpha:&a]) {
return [UIColor colorWithHue:h
saturation:s
brightness:MIN(b * 1.3, 1.0)
alpha:a];
}
CGFloat white, alpha;
if ([self getWhite:&white alpha:&alpha]) {
white = MIN(1.3*white, 1.0);
return [UIColor colorWithWhite:white alpha:alpha];
}
return nil;
}
getHue:saturation:brightness:alpha: fails (and returns NO) when called on a grayscale color, which is why you need the getWhite:alpha: fallback.

UIColor extension and fixing lighterColorForColor
extension UIColor {
class func darkerColorForColor(color: UIColor) -> UIColor {
var r:CGFloat = 0, g:CGFloat = 0, b:CGFloat = 0, a:CGFloat = 0
if color.getRed(&r, green: &g, blue: &b, alpha: &a){
return UIColor(red: max(r - 0.2, 0.0), green: max(g - 0.2, 0.0), blue: max(b - 0.2, 0.0), alpha: a)
}
return UIColor()
}
class func lighterColorForColor(color: UIColor) -> UIColor {
var r:CGFloat = 0, g:CGFloat = 0, b:CGFloat = 0, a:CGFloat = 0
if color.getRed(&r, green: &g, blue: &b, alpha: &a){
let tmpColor = UIColor(red: min(r + 0.2, 1.0), green: min(g + 0.2, 1.0), blue: min(b + 0.2, 1.0), alpha: a)
return tmpColor
}
return UIColor()
}
}

I'm not sure if you're looking for some sort of Objective-C answer, but based on how colors specified by RGBA work, I think you can simply scale the RGB values according to an arbitrary factor to get a "lighter" or "darker" shade. For example, you might have a blue:
[UIColor colorWithRed:0.0 green:0.0 blue:1.0 alpha:1.0];
Want a darker blue? Multiply the RGB values by 0.9:
[UIColor colorWithRed:0.0 green:0.0 blue:0.9 alpha:1.0];
Voila. Or maybe you have an orange:
[UIColor colorWithRed:1.0 green:0.4 blue:0.0 alpha:1.0];
Choose another scale factor, say, 0.8:
[UIColor colorWithRed:0.8 green:0.32 blue:0.0 alpha:1.0];
Is that the sort of effect you're looking for?
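If that is the effect you want, here is a small Swift sketch of the same proportional-scaling idea (the method name scaled(by:) is mine, not an API). One caveat: multiplying cannot lighten a component that is already 0, which is one reason other answers prefer the HSB route.
import UIKit

extension UIColor {
    // Sketch only: multiply each RGB component by a factor.
    // factor < 1.0 darkens, factor > 1.0 lightens (result clamped to 0...1).
    func scaled(by factor: CGFloat) -> UIColor {
        var r: CGFloat = 0, g: CGFloat = 0, b: CGFloat = 0, a: CGFloat = 0
        guard getRed(&r, green: &g, blue: &b, alpha: &a) else { return self }
        func clamp(_ v: CGFloat) -> CGFloat { return min(max(v, 0), 1) }
        return UIColor(red: clamp(r * factor),
                       green: clamp(g * factor),
                       blue: clamp(b * factor),
                       alpha: a)
    }
}

// let darkerBlue = UIColor.blue.scaled(by: 0.9)
// let darkerOrange = UIColor(red: 1.0, green: 0.4, blue: 0.0, alpha: 1.0).scaled(by: 0.8)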

Tested in Xcode 10 with Swift 4.x for iOS 12
Start with your color as a UIColor and pick a darkening factor (as a CGFloat)
let baseColor = UIColor.red
let darkenFactor: CGFloat = 2
CGColor has an optional components property that breaks the color down into RGBA (a CGFloat array with values between 0 and 1). You can then reconstruct a UIColor from the RGBA values taken from the CGColor and manipulate them.
let darkenedBase = UIColor(displayP3Red: baseColor.cgColor.components![0] / darkenFactor, green: baseColor.cgColor.components![1] / darkenFactor, blue: baseColor.cgColor.components![2] / darkenFactor, alpha: 1)
In this example, each of the RGB values was divided by 2, so every component ends up at half its original value, making the color darker. The alpha value stays the same, but you could alternatively apply the darkening factor to the alpha value rather than the RGB components.

Ideally, the functions should be encapsulated inside a UIColor extension called UIColor+Brightness.swift, and have a configurable brightness factor - see the example below:
import UIKit
extension UIColor {
func lighterColorWithBrightnessFactor(brightnessFactor:CGFloat) -> UIColor {
var r:CGFloat = 0, g:CGFloat = 0, b:CGFloat = 0, a:CGFloat = 0
if self.getRed(&r, green:&g, blue:&b, alpha:&a) {
return UIColor(red:min(r + brightnessFactor, 1.0),
green:min(g + brightnessFactor, 1.0),
blue:min(b + brightnessFactor, 1.0),
alpha:a)
}
return UIColor()
}
}

I render coloured cells based on a status value:
For this I wrote a Swift extension based on some old Objective-C code, after I got an error using CryingHippo's suggestion:
extension UIColor{
func darker(darker: CGFloat) -> UIColor{
var red: CGFloat = 0.0
var green: CGFloat = 0.0
var blue: CGFloat = 0.0
if self.colorSpace == UIColorSpace.genericGrayColorSpace(){
red = whiteComponent - darker
green = whiteComponent - darker
blue = whiteComponent - darker
} else {
red = redComponent - darker
green = greenComponent - darker
blue = blueComponent - darker
}
if red < 0{
green += red/2
blue += red/2
}
if green < 0{
red += green/2
blue += green/2
}
if blue < 0{
green += blue/2
red += blue/2
}
return UIColor(
calibratedRed: red,
green: green,
blue: blue,
alpha: alphaComponent
)
}
func lighter(lighter: CGFloat) -> UIColor{
return darker(-lighter)
}
}
Note that colorSpace, the whiteComponent/redComponent accessors, and the calibratedRed: initializer used above come from NSColor (AppKit), so this extension really belongs on NSColor on macOS; for UIColor on iOS you would first extract the components with getRed(_:green:blue:alpha:) or getWhite(_:alpha:).

Here is a UIColor category that also allows control over the amount of color change.
- (UIColor *)lighterColorWithDelta:(CGFloat)delta
{
CGFloat r, g, b, a;
if ([self getRed:&r green:&g blue:&b alpha:&a])
return [UIColor colorWithRed:MIN(r + delta, 1.0)
green:MIN(g + delta, 1.0)
blue:MIN(b + delta, 1.0)
alpha:a];
return nil;
}
- (UIColor *)darkerColorWithDelta:(CGFloat)delta
{
CGFloat r, g, b, a;
if ([self getRed:&r green:&g blue:&b alpha:&a])
return [UIColor colorWithRed:MAX(r - delta, 0.0)
green:MAX(g - delta, 0.0)
blue:MAX(b - delta, 0.0)
alpha:a];
return nil;
}

A Swift extension based on @Sebyddd's answer:
import Foundation
import UIKit
extension UIColor{
func colorWith(brightness: CGFloat) -> UIColor{
var r:CGFloat = 0, g:CGFloat = 0, b:CGFloat = 0, a:CGFloat = 0
if getRed(&r, green: &g, blue: &b, alpha: &a){
return UIColor(red: min(max(r + brightness, 0.0), 1.0), green: min(max(g + brightness, 0.0), 1.0), blue: min(max(b + brightness, 0.0), 1.0), alpha: a)
}
return UIColor()
}
}

For a darker color, this is the simplest:
theColor = [theColor shadowWithLevel:s]; //s:0.0 to 1.0
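One caveat worth noting: shadowWithLevel: is an NSColor (AppKit) method, so this applies on macOS rather than to UIColor. A rough Swift sketch, with highlight(withLevel:) as the lightening counterpart:
import AppKit

// macOS only: NSColor can blend toward black or white for you.
let base = NSColor(red: 0.5, green: 0.5, blue: 0.5, alpha: 1.0)
let darker = base.shadow(withLevel: 0.3)     // returns NSColor?, level 0.0...1.0
let lighter = base.highlight(withLevel: 0.3) // returns NSColor?, level 0.0...1.0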

Related

Why is the background color of my UIButton not affected in state config?

I am trying to configure UIButton disabled state using attribute inspector 'state config.' How can I configure the background color for the disabled state?
Try this:
@IBAction func buttonStateChanged(sender: UIButton) {
if(sender.selected){
sender.backgroundColor = UIColor(red: 1.0, green: 0.0, blue: 0.2, alpha: 1.0)
}else{
// use a different color (here just a dimmed alpha) for the non-selected state
sender.backgroundColor = UIColor(red: 1.0, green: 0.0, blue: 0.2, alpha: 0.5)
}
}
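Since the question is specifically about the disabled state: UIButton only exposes a per-state background image, not a per-state background color, so a common workaround (sketched below, with solidImage(from:) as a made-up helper, not a UIKit API) is to render the color into a 1x1 image and assign it with setBackgroundImage(_:for:).
import UIKit

// Render a UIColor into a 1x1 image so it can be used as a per-state background.
func solidImage(from color: UIColor) -> UIImage {
    let renderer = UIGraphicsImageRenderer(size: CGSize(width: 1, height: 1))
    return renderer.image { context in
        color.setFill()
        context.fill(CGRect(x: 0, y: 0, width: 1, height: 1))
    }
}

let button = UIButton(type: .custom)
button.setBackgroundImage(solidImage(from: .systemBlue), for: .normal)
button.setBackgroundImage(solidImage(from: .lightGray), for: .disabled)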

How to add a gradient tint color to a UISlider in XCode 6?

I'm working on a design application that has a section for selecting colors by three sliders for RGB.
As we can see in Xcode, where we want to select a color by RGB values, the slider tint color is a gradient that changes when we move the sliders. I want to use this in my application, but I have no idea how to do it.
I've found this code in a blog, but it didn't work for me.
- (void)setGradientToSlider:(UISlider *)Slider WithColors:(NSArray *)Colors{
UIView * view = (UIView *)[[Slider subviews]objectAtIndex:0];
UIImageView * maxTrackImageView = (UIImageView *)[[view subviews]objectAtIndex:0];
CAGradientLayer * maxTrackGradient = [CAGradientLayer layer];
CGRect rect = maxTrackImageView.frame;
rect.origin.x = view.frame.origin.x;
maxTrackGradient.frame = rect;
maxTrackGradient.colors = Colors;
[maxTrackGradient setStartPoint:CGPointMake(0.0, 0.5)];
[maxTrackGradient setEndPoint:CGPointMake(1.0, 0.5)];
[[maxTrackImageView layer] insertSublayer:maxTrackGradient atIndex:0];
/////////////////////////////////////////////////////
UIImageView * minTrackImageView = (UIImageView *)[[view subviews]objectAtIndex:1];
CAGradientLayer * minTrackGradient = [CAGradientLayer layer];
rect = minTrackImageView.frame;
rect.size.width = maxTrackImageView.frame.size.width;
rect.origin.x = 0;
rect.origin.y = 0;
minTrackGradient.frame = rect;
minTrackGradient.colors = Colors;
[minTrackGradient setStartPoint:CGPointMake(0.0, 0.5)];
[minTrackGradient setEndPoint:CGPointMake(1.0, 0.5)];
[minTrackImageView.layer insertSublayer:minTrackGradient atIndex:0];
}
I would appreciate any help. Thanks.
While it didn't give me the desired results, here is a down-and-dirty Swift version of the answer above for those who want to try it.
func setSlider(slider:UISlider) {
let tgl = CAGradientLayer()
let frame = CGRectMake(0, 0, slider.frame.size.width, 5)
tgl.frame = frame
tgl.colors = [UIColor.blueColor().CGColor, UIColor.greenColor().CGColor, UIColor.yellowColor().CGColor, UIColor.orangeColor().CGColor, UIColor.redColor().CGColor]
tgl.startPoint = CGPointMake(0.0, 0.5)
tgl.endPoint = CGPointMake(1.0, 0.5)
UIGraphicsBeginImageContextWithOptions(tgl.frame.size, tgl.opaque, 0.0);
tgl.renderInContext(UIGraphicsGetCurrentContext()!)
let image = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()
image.resizableImageWithCapInsets(UIEdgeInsetsZero)
slider.setMinimumTrackImage(image, forState: .Normal)
//slider.setMaximumTrackImage(image, forState: .Normal)
}
UPDATE for Swift 4.0
func setSlider(slider:UISlider) {
let tgl = CAGradientLayer()
let frame = CGRect.init(x:0, y:0, width:slider.frame.size.width, height:5)
tgl.frame = frame
tgl.colors = [UIColor.blue.cgColor, UIColor.green.cgColor, UIColor.yellow.cgColor, UIColor.orange.cgColor, UIColor.red.cgColor]
tgl.startPoint = CGPoint.init(x:0.0, y:0.5)
tgl.endPoint = CGPoint.init(x:1.0, y:0.5)
UIGraphicsBeginImageContextWithOptions(tgl.frame.size, tgl.isOpaque, 0.0);
tgl.render(in: UIGraphicsGetCurrentContext()!)
if let image = UIGraphicsGetImageFromCurrentImageContext() {
UIGraphicsEndImageContext()
image.resizableImage(withCapInsets: UIEdgeInsets.zero)
slider.setMinimumTrackImage(image, for: .normal)
}
}
Here is a possible solution:
Usage:
//array of CGColor objects, color1 and color2 are UIColor objects
NSArray *colors = [NSArray arrayWithObjects:(id)color1.CGColor, (id)color2.CGColor, nil];
//your UISlider
[slider setGradientBackgroundWithColors:colors];
Implementation:
Create category on UISlider:
- (void)setGradientBackgroundWithColors:(NSArray *)colors
{
CAGradientLayer *trackGradientLayer = [CAGradientLayer layer];
CGRect frame = self.frame;
frame.size.height = 5.0; //set the height of slider
trackGradientLayer.frame = frame;
trackGradientLayer.colors = colors;
//setting gradient as horizontal
trackGradientLayer.startPoint = CGPointMake(0.0, 0.5);
trackGradientLayer.endPoint = CGPointMake(1.0, 0.5);
UIImage *trackImage = [[UIImage imageFromLayer:trackGradientLayer] resizableImageWithCapInsets:UIEdgeInsetsZero];
[self setMinimumTrackImage:trackImage forState:UIControlStateNormal];
[self setMaximumTrackImage:trackImage forState:UIControlStateNormal];
}
Where colors is array of CGColor.
I have also created a category on UIImage which creates image from layer as you need an UIImage for setting gradient on slider.
+ (UIImage *)imageFromLayer:(CALayer *)layer
{
UIGraphicsBeginImageContextWithOptions(layer.frame.size, layer.opaque, 0.0);
[layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *outputImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return outputImage;
}
For Swift 3, and to prevent the slider from scaling the Min image, apply resizableImage(withCapInsets:) at the moment you set the image. Recalculating the slider's left side is not necessary; only recalculate if you are changing the colors of the gradient. The Max image does not seem to get scaled, but you should probably apply the same treatment for consistency (there is a slight difference on the Max image when its insets are not applied).
slider.setMinimumTrackImage(image?.resizableImage(withCapInsets:.zero), for: .normal)
For some reason it only works properly when resizableImage(withCapInsets:.zero) is applied in the same statement; doing that part separately does not work and the image gets scaled.
Here is the entire routine in Swift 3:
func setSlider(slider:UISlider) {
let tgl = CAGradientLayer()
let frame = CGRect(x: 0.0, y: 0.0, width: slider.bounds.width, height: 5.0 )
tgl.frame = frame
tgl.colors = [ UIColor.yellow.cgColor,UIColor.black.cgColor]
tgl.endPoint = CGPoint(x: 1.0, y: 1.0)
tgl.startPoint = CGPoint(x: 0.0, y: 1.0)
UIGraphicsBeginImageContextWithOptions(tgl.frame.size, false, 0.0)
tgl.render(in: UIGraphicsGetCurrentContext()!)
let image = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()
slider.setMaximumTrackImage(image?.resizableImage(withCapInsets:.zero), for: .normal)
slider.setMinimumTrackImage(image?.resizableImage(withCapInsets:.zero), for: .normal)
}
This is a really effective approach that I found after a lot of searching the web, so it's better to share it here as a complete answer. The following code is a Swift class that you can use to create and use gradients as a UIView or a UIImage.
import Foundation
import UIKit
class Gradient: UIView{
// Gradient Color Array
private var Colors: [UIColor] = []
// Start And End Points Of Linear Gradient
private var SP: CGPoint = CGPoint.zeroPoint
private var EP: CGPoint = CGPoint.zeroPoint
// Start And End Center Of Radial Gradient
private var SC: CGPoint = CGPoint.zeroPoint
private var EC: CGPoint = CGPoint.zeroPoint
// Start And End Radius Of Radial Gradient
private var SR: CGFloat = 0.0
private var ER: CGFloat = 0.0
// Flag To Specify If The Gradient Is Radial Or Linear
private var flag: Bool = false
// Some Overrided Init Methods
required init(coder aDecoder: NSCoder) {
super.init(coder: aDecoder)
}
override init(frame: CGRect) {
super.init(frame: frame)
}
// Draw Rect Method To Draw The Graphics On The Context
override func drawRect(rect: CGRect) {
// Get Context
let context = UIGraphicsGetCurrentContext()
// Get Color Space
let colorSpace = CGColorSpaceCreateDeviceRGB()
// Create Arrays To Convert The UIColor to CG Color
var colorComponent: [CGColor] = []
var colorLocations: [CGFloat] = []
var i: CGFloat = 0.0
// Add Colors Into The Color Components And Use An Index Variable For Their Location In The Array [The Location Is From 0.0 To 1.0]
for color in Colors {
colorComponent.append(color.CGColor)
colorLocations.append(i)
i += CGFloat(1.0) / CGFloat(self.Colors.count - 1)
}
// Create The Gradient With The Colors And Locations
let gradient: CGGradientRef = CGGradientCreateWithColors(colorSpace, colorComponent, colorLocations)
// Create The Suitable Gradient Based On Desired Type
if flag {
CGContextDrawRadialGradient(context, gradient, SC, SR, EC, ER, 0)
} else {
CGContextDrawLinearGradient(context, gradient, SP, EP, 0)
}
}
// Get The Input Data For Linear Gradient
func CreateLinearGradient(startPoint: CGPoint, endPoint: CGPoint, colors: UIColor...) {
self.Colors = colors
self.SP = startPoint
self.EP = endPoint
self.flag = false
}
// Get The Input Data For Radial Gradient
func CreateRadialGradient(startCenter: CGPoint, startRadius: CGFloat, endCenter: CGPoint, endRadius: CGFloat, colors: UIColor...) {
self.Colors = colors
self.SC = startCenter
self.EC = endCenter
self.SR = startRadius
self.ER = endRadius
self.flag = true
}
// Function To Convert Gradient To UIImage And Return It
func getImage() -> UIImage {
// Begin Image Context
UIGraphicsBeginImageContext(self.bounds.size)
// Draw The Gradient
self.drawRect(self.frame)
// Get Image From The Current Context
let image = UIGraphicsGetImageFromCurrentImageContext()
// End Image Context
UIGraphicsEndImageContext()
// Return The Result Gradient As UIImage
return image
}
}
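A possible usage sketch, written in the same pre-Swift 3 dialect as the class above; gradientView is created here, while view and slider are assumed to already exist in your own code.
// Configure a linear gradient, then either show the view or grab a UIImage from it.
let gradientView = Gradient(frame: CGRect(x: 0, y: 0, width: 200, height: 30))
gradientView.CreateLinearGradient(CGPoint(x: 0, y: 0),
    endPoint: CGPoint(x: 200, y: 0),
    colors: UIColor.redColor(), UIColor.yellowColor())

// As a view:
view.addSubview(gradientView)

// As an image, e.g. for a slider track:
slider.setMinimumTrackImage(gradientView.getImage(), forState: .Normal)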

Swift - port bar graph code from obj-c

SOLUTION: Here is the GitHub link for the class I eventually created / will expand on https://github.com/ckalas/SimpleSwiftBarGraph
I'm trying to (in playground) draw a bar graph using Core Graphics. I'm basing it off this code in Obj-C:
- (void)drawRect:(CGRect)rect {
CGFloat height = self.bounds.size.height;
CGContextRef context = UIGraphicsGetCurrentContext();
CGContextClearRect(context, rect);
CGContextSetFillColorWithColor(context, [UIColor grayColor].CGColor);
CGFloat barWidth = 30;
int count = 0;
for (NSNumber *num in values) {
CGFloat x = count * (barWidth + 10);
CGRect barRect = CGRectMake(x, height - ([num floatValue] * height), barWidth, [num floatValue] * height);
CGContextAddRect(context, barRect);
count++;
}
CGContextFillPath(context);
}
I'm trying to convert this, but at the second line, Xcode doesn't know about UIGraphicsGetCurrentContext(). Can anyone offer help?
EDIT: Here is my current code. The issues are now in the for loop where I commented: no matter what combination I try, I cannot get that line to work, just "can't convert type to type" errors. I've tried declaring the array as [NSNumber], with no luck.
class CustomView: UIView {
init(frame: CGRect) {
super.init(frame: frame)
}
override func drawRect(rect: CGRect) {
let values: [UInt8] = [1,2,3]
let height = self.bounds.size.height
let context = UIGraphicsGetCurrentContext()
CGContextClearRect(context, self.frame)
CGContextSetFillColorWithColor(context, UIColor.grayColor().CGColor!);
let barWidth = 30.0 as CGFloat
var count = 0 as CGFloat
for number in values {
let x = count * (barWidth + 10) as CGFloat
let barRect = CGRectMake(x, (height - number*height), barWidth, number*height) // issues here can't convert one type to another
CGContextAddRect(context, barRect)
count++
}
CGContextFillPath(context)
}
Declare your values array to be of type [CGFloat]. I made a few other stylistic changes (removed semicolons, CGColor is already implicitly unwrapped so ! is not needed, and prefer type declaration to cast).
class CustomView: UIView {
init(frame: CGRect) {
super.init(frame: frame)
}
override func drawRect(rect: CGRect) {
let values: [CGFloat] = [0.3, 0.6, 0.9]
let height = self.bounds.size.height
let context = UIGraphicsGetCurrentContext()
CGContextClearRect(context, self.frame)
CGContextSetFillColorWithColor(context, UIColor.grayColor().CGColor)
let barWidth:CGFloat = 30.0
var count:CGFloat = 0
for number in values {
let x = count * (barWidth + 10)
let barRect = CGRectMake(x, (height - number*height), barWidth, number*height)
CGContextAddRect(context, barRect)
count++
}
CGContextFillPath(context)
}
}
The values in the values array should range from 0.0 for no bar height to 1.0 for full bar height.

Convert NSColor to RGB

I'm trying to convert an NSColor to RGB, but it seems to give an entirely incorrect result:
NSColor *testColor = [NSColor colorWithCalibratedWhite:0.65 alpha:1.0];
const CGFloat* components = CGColorGetComponents(testColor.CGColor);
NSLog(#"Red: %f", components[0]);
NSLog(#"Green: %f", components[1]);
NSLog(#"Blue: %f", components[2]);
NSLog(#"Alpha: %f", CGColorGetAlpha(testColor.CGColor));
I get back: red = 0.65, green = 1.0, blue = 0.0, and alpha = 1.0, which results in an entirely different color (it should be gray, but now it's green).
Am I doing something wrong?
Extracting RGBA values from NSColor: (Swift 3)
let nsColor: NSColor = .red
let ciColor: CIColor = .init(color: nsColor)!
print(ciColor.red) // 1.0
print(ciColor.green) // 0.0
print(ciColor.blue) // 0.0
print(ciColor.alpha) // 1.0 // Or use nsColor.alphaComponent
NOTE: NSColor.blackColor().redComponent will crash the app, but the above code won't
I had the same problem when I wanted to convert a picked color to hexadecimal. The NSColor component values were not correct. I managed to resolve my problem with your comment above.
Example in Swift:
let colorTest = NSColor.init(calibratedWhite: 0.65, alpha: 1.0)
let color = colorTest.usingColorSpace(NSColorSpace.deviceRGB) ?? colorTest
print(colorTest)
// NSCalibratedWhiteColorSpace 0.65 1
print(colorTest.colorSpace)
// Generic Gray colorspace
print("red: \(color.redComponent) green:\(color.greenComponent) blue:\(color.blueComponent)")
// red: 0.708725869655609 green:0.708725869655609 blue:0.708725869655609
You need to convert the color to an RGB color space using an NSColorSpace object first; then you can get the components using the various NSColor accessor methods.
For an NSColor *color:
NSColor *rgbColor = [color colorUsingColorSpace:[NSColorSpace sRGBColorSpace]];
CGFloat red = [rgbColor redComponent];
CGFloat green = [rgbColor greenComponent];
CGFloat blue = [rgbColor blueComponent];
I have used this in the past, and it worked for me.
NSColorSpace *colorSpace = [NSColorSpace sRGBColorSpace];
CGFloat SRGB[4] = {0.65, 0.65, 0.65, 1.0}; // example sRGB components: red, green, blue, alpha
NSColor *testColor = [NSColor colorWithColorSpace:colorSpace components:SRGB count:4];
CGFloat red = [testColor redComponent];
CGFloat green = [testColor greenComponent];
CGFloat blue = [testColor blueComponent];
You have to check the color space first. If it's RGB, you can use:
CGFloat red = [testColor redComponent];
...
For grayscale you have to read the components differently:
CGFloat red = [testColor whiteComponent];
CGFloat blue = [testColor whiteComponent];
CGFloat green = [testColor whiteComponent];
Here’s a safe Swift 5 SKColor extension for getting the RGB components of an NSColor or UIColor. (Note SKColor is just a typealias for one or the other based on the platform.)
public extension SKColor {
var sRGBAComponents: (red: CGFloat , green: CGFloat, blue: CGFloat, alpha: CGFloat) {
#if os(iOS)
let rgbColor = self // lolz no color conversion on iOS, but on iOS it'll respond to getRed(...) anyhow
#elseif os(macOS)
let rgbColor = usingColorSpace(.extendedSRGB) ?? SKColor(red: 1, green: 1, blue: 1, alpha: 1) // will return 'self' if already RGB
#endif
var red: CGFloat = 0, green: CGFloat = 0, blue: CGFloat = 0, alpha: CGFloat = 0
rgbColor.getRed(&red, green: &green, blue: &blue, alpha: &alpha)
return (red: red, green: green, blue: blue, alpha: alpha)
}
}

How can I draw an arrow using Core Graphics?

I need to draw a line with an arrow at its end in my drawing app. I'm not good at trigonometry, so I can't solve this problem.
The user puts a finger on the screen and draws a line in any direction.
So the arrow should appear at the end of the line.
UPDATE
I've posted a Swift version of this answer separately.
ORIGINAL
This is a fun little problem. First of all, there are lots of ways to draw arrows, with curved or straight sides. Let's pick a very simple way and label the measurements we'll need:
We want to write a function that takes the start point, the end point, the tail width, the head width, and the head length, and returns a path outlining the arrow shape. Let's create a category named dqd_arrowhead to add this method to UIBezierPath:
// UIBezierPath+dqd_arrowhead.h
@interface UIBezierPath (dqd_arrowhead)
+ (UIBezierPath *)dqd_bezierPathWithArrowFromPoint:(CGPoint)startPoint
toPoint:(CGPoint)endPoint
tailWidth:(CGFloat)tailWidth
headWidth:(CGFloat)headWidth
headLength:(CGFloat)headLength;
@end
Since there are seven corners on the path of the arrow, let's start our implementation by naming that constant:
// UIBezierPath+dqd_arrowhead.m
#import "UIBezierPath+dqd_arrowhead.h"
#define kArrowPointCount 7
@implementation UIBezierPath (dqd_arrowhead)
+ (UIBezierPath *)dqd_bezierPathWithArrowFromPoint:(CGPoint)startPoint
toPoint:(CGPoint)endPoint
tailWidth:(CGFloat)tailWidth
headWidth:(CGFloat)headWidth
headLength:(CGFloat)headLength {
OK, the easy part is done. Now, how do we find the coordinates of those seven points on the path? It is much easier to find the points if the arrow is aligned along the X axis:
It's pretty easy to compute the point coordinates on an axis-aligned arrow, but we'll need the overall length of the arrow to do it. We'll use the hypotf function from the standard library:
CGFloat length = hypotf(endPoint.x - startPoint.x, endPoint.y - startPoint.y);
We'll call on a helper method to actually compute the seven points:
CGPoint points[kArrowPointCount];
[self dqd_getAxisAlignedArrowPoints:points
forLength:length
tailWidth:tailWidth
headWidth:headWidth
headLength:headLength];
But we need to transform those points, because in general we're not trying to create an axis-aligned arrow. Fortunately, Core Graphics supports a kind of transformation called an affine transformation, which lets us rotate and translate (slide) points. We'll call another helper method to create the transform that turns our axis-aligned arrow into the arrow we were asked for:
CGAffineTransform transform = [self dqd_transformForStartPoint:startPoint
endPoint:endPoint
length:length];
Now we can create a Core Graphics path using the points of the axis-aligned arrow and the transform that turns it into the arrow we want:
CGMutablePathRef cgPath = CGPathCreateMutable();
CGPathAddLines(cgPath, &transform, points, sizeof points / sizeof *points);
CGPathCloseSubpath(cgPath);
Finally, we can wrap a UIBezierPath around the CGPath and return it:
UIBezierPath *uiPath = [UIBezierPath bezierPathWithCGPath:cgPath];
CGPathRelease(cgPath);
return uiPath;
}
Here's the helper method that computes the point coordinates. It's quite simple. Refer back to the diagram of the axis-aligned arrow if you need to.
+ (void)dqd_getAxisAlignedArrowPoints:(CGPoint[kArrowPointCount])points
forLength:(CGFloat)length
tailWidth:(CGFloat)tailWidth
headWidth:(CGFloat)headWidth
headLength:(CGFloat)headLength {
CGFloat tailLength = length - headLength;
points[0] = CGPointMake(0, tailWidth / 2);
points[1] = CGPointMake(tailLength, tailWidth / 2);
points[2] = CGPointMake(tailLength, headWidth / 2);
points[3] = CGPointMake(length, 0);
points[4] = CGPointMake(tailLength, -headWidth / 2);
points[5] = CGPointMake(tailLength, -tailWidth / 2);
points[6] = CGPointMake(0, -tailWidth / 2);
}
Computing the affine transform is more complicated. This is where the trigonometry comes in. You could use atan2 and the CGAffineTransformRotate and CGAffineTransformTranslate functions to create it, but if you remember enough trigonometry, you can create it directly. Consult “The Math Behind the Matrices” in the Quartz 2D Programming Guide for more information about what I'm doing here:
+ (CGAffineTransform)dqd_transformForStartPoint:(CGPoint)startPoint
endPoint:(CGPoint)endPoint
length:(CGFloat)length {
CGFloat cosine = (endPoint.x - startPoint.x) / length;
CGFloat sine = (endPoint.y - startPoint.y) / length;
return (CGAffineTransform){ cosine, sine, -sine, cosine, startPoint.x, startPoint.y };
}
@end
I have put all of the code in a gist for easy copy'n'paste.
With this category, you can easily draw arrows:
Since you're just generating a path, you can choose not to fill it, or not to stroke it as in this example:
You have to be careful, though. This code doesn't prevent you from getting funky results if you make the head width less than the tail width, or if you make the head length larger than the total arrow length:
Here's a Swift version of my old Objective-C code. It should work in Swift 3.2 and later versions.
extension UIBezierPath {
static func arrow(from start: CGPoint, to end: CGPoint, tailWidth: CGFloat, headWidth: CGFloat, headLength: CGFloat) -> UIBezierPath {
let length = hypot(end.x - start.x, end.y - start.y)
let tailLength = length - headLength
func p(_ x: CGFloat, _ y: CGFloat) -> CGPoint { return CGPoint(x: x, y: y) }
let points: [CGPoint] = [
p(0, tailWidth / 2),
p(tailLength, tailWidth / 2),
p(tailLength, headWidth / 2),
p(length, 0),
p(tailLength, -headWidth / 2),
p(tailLength, -tailWidth / 2),
p(0, -tailWidth / 2)
]
let cosine = (end.x - start.x) / length
let sine = (end.y - start.y) / length
let transform = CGAffineTransform(a: cosine, b: sine, c: -sine, d: cosine, tx: start.x, ty: start.y)
let path = CGMutablePath()
path.addLines(between: points, transform: transform)
path.closeSubpath()
return self.init(cgPath: path)
}
}
Here's an example of how you'd call it:
let arrow = UIBezierPath.arrow(from: CGPoint(x: 50, y: 100), to: CGPoint(x: 200, y: 50),
tailWidth: 10, headWidth: 25, headLength: 40)
// This is the integration into the view of the previous example.
// Attach the following class to your view in the xib file.
#import <UIKit/UIKit.h>
@interface Arrow : UIView
@end
#import "Arrow.h"
#import "UIBezierPath+dqd_arrowhead.h"
@implementation Arrow
{
CGPoint startPoint;
CGPoint endPoint;
CGFloat tailWidth;
CGFloat headWidth;
CGFloat headLength;
UIBezierPath *path;
}
- (id)initWithCoder:(NSCoder *)aDecoder
{
if (self = [super initWithCoder:aDecoder])
{
[self setMultipleTouchEnabled:NO];
[self setBackgroundColor:[UIColor whiteColor]];
}
return self;
}
- (void)drawRect:(CGRect)rect {
[[UIColor redColor] setStroke];
tailWidth = 4;
headWidth = 8;
headLength = 8;
path = [UIBezierPath dqd_bezierPathWithArrowFromPoint:startPoint
toPoint:endPoint
tailWidth:tailWidth
headWidth:headWidth
headLength:headLength];
[path setLineWidth:2.0];
[path stroke];
}
- (void) touchesBegan:(NSSet*)touches withEvent:(UIEvent*)event
{
UITouch* touchPoint = [touches anyObject];
startPoint = [touchPoint locationInView:self];
endPoint = [touchPoint locationInView:self];
[self setNeedsDisplay];
}
-(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
UITouch* touch = [touches anyObject];
endPoint=[touch locationInView:self];
[self setNeedsDisplay];
}
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
UITouch* touch = [touches anyObject];
endPoint = [touch locationInView:self];
[self setNeedsDisplay];
}
@end
In Swift 3.0 you can achieve this with
extension UIBezierPath {
class func arrow(from start: CGPoint, to end: CGPoint, tailWidth: CGFloat, headWidth: CGFloat, headLength: CGFloat) -> Self {
let length = hypot(end.x - start.x, end.y - start.y)
let tailLength = length - headLength
func p(_ x: CGFloat, _ y: CGFloat) -> CGPoint { return CGPoint(x: x, y: y) }
let points: [CGPoint] = [
p(0, tailWidth / 2),
p(tailLength, tailWidth / 2),
p(tailLength, headWidth / 2),
p(length, 0),
p(tailLength, -headWidth / 2),
p(tailLength, -tailWidth / 2),
p(0, -tailWidth / 2)
]
let cosine = (end.x - start.x) / length
let sine = (end.y - start.y) / length
let transform = CGAffineTransform(a: cosine, b: sine, c: -sine, d: cosine, tx: start.x, ty: start.y)
let path = CGMutablePath()
path.addLines(between: points, transform: transform)
path.closeSubpath()
return self.init(cgPath: path)
}
}