How to integrate the Chartboost iOS SDK with Swift

I am trying to integrate the Chartboost iOS SDK with Swift. I have followed all the instructions on the Chartboost iOS Integration page https://answers.chartboost.com/hc/en-us/articles/201220095-iOS-Integration and have created a bridging header to use the framework in my Swift project.
BridgingHeader.h
#import <Chartboost/Chartboost.h>
#import <Chartboost/CBNewsfeed.h>
#import <CommonCrypto/CommonDigest.h>
#import <AdSupport/AdSupport.h>
My BridgingHeader.h file is located in my project root directory http://i.imgur.com/DcTcixo.png and I have followed the necessary steps to add the BridgingHeader.h to my Build Settings http://i.imgur.com/jvtzs7a.png, but when I run the project I get 52 errors like this: http://i.imgur.com/WCvyooz.png. Why am I getting these errors, and how do I get rid of them?

It looks like the headers require UIKit and Foundation. Add this line at the top of your bridging header:
#import <UIKit/UIKit.h>
Also, have you made sure your bridging header is in the project's root in the file system? The hierarchy of Xcode's Project Navigator isn't necessarily the same as the file system's.
It's looking for the header in /Users/andrew/Documents/dev/ios/Protect Paigridge/. Open Finder and make sure the header is in that directory. Xcode may have created it a level deeper, where the rest of your code files are. If that's the case, you can either edit the entry in Build Settings or move the file.

Adding onto the top answer above: try adding both
#import <Foundation/Foundation.h>
#import <UIKit/UIKit.h>
to your BridgingHeader.h file, and make sure they both appear above the
#import <Chartboost/Chartboost.h>
line, as shown above.

Swift Chartboost code. Add it to your AppDelegate class (which must be declared as conforming to ChartboostDelegate):
func applicationDidBecomeActive(application: UIApplication) {
    let kChartboostAppID = "5472ef8f04b01601a1a5814c"
    let kChartboostAppSignature = "5b6222426e68cda48669a1d4d8246d4c3d20db9c"
    Chartboost.startWithAppId(kChartboostAppID, appSignature: kChartboostAppSignature, delegate: self)
    Chartboost.cacheMoreApps(CBLocationHomeScreen)
}

class func showChartboostAds() {
    Chartboost.showInterstitial(CBLocationHomeScreen)
}

func didFailToLoadInterstitial(location: String!, withError error: CBLoadError) {
}

func didDismissInterstitial(location: String!) {
    if location == CBLocationHomeScreen {
        Chartboost.cacheInterstitial(CBLocationMainMenu)
    } else if location == CBLocationMainMenu {
        Chartboost.cacheInterstitial(CBLocationGameOver)
    } else if location == CBLocationGameOver {
        Chartboost.cacheInterstitial(CBLocationLevelComplete)
    } else if location == CBLocationLevelComplete {
        Chartboost.cacheInterstitial(CBLocationHomeScreen)
    }
}
Calling Syntax:
AppDelegate.showChartboostAds()

This issue is about how to import. You can write the Objective-C code behind a bridging header; in Objective-C the call looks like this:
[Chartboost showInterstitial:CBLocationHomeScreen];
In Swift it becomes:
Chartboost.showInterstitial(CBLocationHomeScreen)
I hope that helps you.


How to show Airplay panel programmatically on iOS using flutter

The desired output looks like this.
How do I show this on a click event programmatically using Flutter (even with native code if possible)? It would be really appreciated if anyone could show an example.
If there is no direct approach to this, then a platform-specific example using a MethodChannel is also very welcome. The native code example must be in Objective-C.
Additionally, I have tried to use flutter_to_airplay, but the project fails to run, and the package also has other functionality that is not needed in this context; all that is needed is showing the AirPlay panel.
(Note: the native code in M123's answer is not working at all.)
Here is an example of how to open the AirPlay panel in Objective-C.
Flutter setup
First you have to create a channel to start communicating with the native code.
All channel names used in a single app must be unique; prefix the channel name with a unique domain prefix:
static const platform = const MethodChannel('stack.M123.dev/airplay');
Then you have to invoke a method on the method channel.
Future<void> _openAirPlay() async {
  try {
    await platform.invokeMethod('openAirPlay');
  } on PlatformException catch (e) {
    print('error');
  }
}
Native part
Now the Flutter part is done.
Open the ios folder in the Flutter project root with Xcode.
Expand Runner > Runner in the Project navigator.
Open the AppDelegate.m located under Runner > Runner in the Project navigator.
Create a FlutterMethodChannel and add a handler inside the application didFinishLaunchingWithOptions: method. Make sure to use the same channel name as was used on the Flutter client side.
#import <Flutter/Flutter.h>
#import "GeneratedPluginRegistrant.h"

@implementation AppDelegate

- (BOOL)application:(UIApplication*)application didFinishLaunchingWithOptions:(NSDictionary*)launchOptions {
  FlutterViewController* controller = (FlutterViewController*)self.window.rootViewController;
  FlutterMethodChannel* airPlayChannel = [FlutterMethodChannel
      methodChannelWithName:@"stack.M123.dev/airplay"
            binaryMessenger:controller.binaryMessenger];

  __weak typeof(self) weakSelf = self;
  [airPlayChannel setMethodCallHandler:^(FlutterMethodCall* call, FlutterResult result) {
    // Note: this handler is invoked on the UI thread.
    if ([@"openAirPlay" isEqualToString:call.method]) {
      int returnValue = [weakSelf openAirPlay];
      result(@(returnValue));
    } else {
      result(FlutterMethodNotImplemented);
    }
  }];

  [GeneratedPluginRegistrant registerWithRegistry:self];
  return [super application:application didFinishLaunchingWithOptions:launchOptions];
}
Add the method to the AppDelegate class, just before @end:
- (int)openAirPlay {
  // Open the AirPlay panel here.
  return (int)(100);
}
Disclaimer: I am not an iOS developer, so the steps are mostly theoretical.
I am following the official guide from Flutter; it can be found in full length here.

SwiftUI's Canvas won't appear in Objective-C project

I'm trying to use SwiftUI in my Objective-C project, so I followed the steps in this article.
After that, I successfully bridged the two files into my Objective-C project, but in my file SwiftUIFile.swift the canvas cannot resume: when I try to resume it, nothing happens.
Here's the code:
import SwiftUI

@available(iOS 13.0, *)
struct DetailInterfacesUISwift: View {
    var labelName = ""
    var body: some View {
        Text(labelName)
    }
}

@available(iOS 13.0, *)
struct DetailInterfacesUISwift_Previews: PreviewProvider {
    static var previews: some View {
        DetailInterfacesUISwift()
    }
}
Notes:
1. I already deleted the script in Build Phases; nothing happened.
2. I already cleaned the project and deleted the downloaded Core Simulator files.
3. It is still not working.
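For context, the usual way to expose a SwiftUI view like this to Objective-C code is to wrap it in a UIHostingController. A minimal, untested sketch follows; the factory class and method names are made up for illustration:

```swift
import SwiftUI
import UIKit

@available(iOS 13.0, *)
@objc class DetailInterfacesFactory: NSObject {
    // Objective-C cannot see generic types such as UIHostingController,
    // so return a plain UIViewController that wraps the SwiftUI view.
    @objc static func makeDetailViewController() -> UIViewController {
        return UIHostingController(rootView: DetailInterfacesUISwift(labelName: "Detail"))
    }
}
```

From Objective-C you can then present the result of [DetailInterfacesFactory makeDetailViewController] like any other view controller (behind an iOS 13 availability check).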

Import Objective-C vendored framework from parent pod using Swift (for a mixed project)

I have a podspec setup as follows:
Pod::Spec.new do |spec|
  spec.name = "TMGAdvertising"
  spec.default_subspecs = ["Core"]
  *** [extra stuff removed] ***
  spec.subspec 'Core' do |tmgadvertising|
    *** [extra stuff removed] ***
  end
  spec.subspec 'Inneractive' do |inneractive|
    inneractive.dependency "TMGAdvertising/Core"
    inneractive.private_header_files = "TMGAdvertising/AdNetworkSupport/Inneractive/SDK/*.h", "TMGAdvertising/AdNetworkSupport/Inneractive/Adapters/*.h"
    inneractive.public_header_files = "TMGAdvertising/AdNetworkSupport/Inneractive/InneractiveWrapper.h"
    inneractive.source_files = ["TMGAdvertising/AdNetworkSupport/Inneractive/Adapters/*.{h,m,swift}", "TMGAdvertising/AdNetworkSupport/Inneractive/SDK/*.{h,m,swift}", "TMGAdvertising/AdNetworkSupport/Inneractive/InneractiveWrapper.{h,m,swift}"]
    inneractive.vendored_frameworks = "TMGAdvertising/AdNetworkSupport/Inneractive/SDK/*.framework"
    inneractive.pod_target_xcconfig = { 'OTHER_LDFLAGS' => ['-ObjC'] }
  end
end
I have no problem writing wrapper classes for the Inneractive framework in Obj-C (housed within TMGAdvertising). Here's one of my example Obj-C wrappers:
#import "InneractiveWrapper.h"
@import IASDKCore;

@implementation InneractiveWrapper

+ (void)initializeSDK:(NSString *)appId {
    [[IASDKCore sharedInstance] initWithAppID:appId];
}

@end
The problem is that I don't want to be writing my wrappers in Objective-C -- I'd prefer to write them in Swift.
Normally I could use a bridging header to accomplish this (that's how it was setup before when integrated directly into the app) but since this is a subspec my understanding is that's not possible.
My question is: is there any way I can directly import this vendored Inneractive framework in a Swift file located in the TMGAdvertising pod?
As far as I know, you should be able to import the framework in your bridging header:
@import IASDKCore;
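If that works, the Swift files inside the pod can then call the framework directly. A minimal sketch, assuming the IASDKCore API shown in the question's Objective-C wrapper and that the vendored framework's module is visible to the pod's Swift sources (the Swift class name is hypothetical):

```swift
import Foundation
import IASDKCore  // assumes the vendored framework exposes a module

// Hypothetical Swift counterpart of the question's Objective-C wrapper.
final class InneractiveSwiftWrapper {
    static func initializeSDK(appId: String) {
        IASDKCore.sharedInstance().initWithAppID(appId)
    }
}
```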

How to encode image (asset) and via photo picker to base64 in React Native (iOS)?

I've been struggling to find a supported solution to encode images (from assets) and images from the photo picker to a base64 string.
I can do this via Swift in a plain native app:
func convertImageTobase64(format: ImageFormat, image: UIImage) -> String? {
    var imageData: Data?
    switch format {
    case .png: imageData = image.pngData()
    case .jpeg(let compression): imageData = image.jpegData(compressionQuality: compression)
    }
    return imageData?.base64EncodedString()
}

var mylogo: UIImage? = UIImage.init(named: "DFU-180x180")
let base64String = convertImageTobase64(format: .png, image: mylogo!)
let dataString = "data:image/jpg;base64," + base64String!
I tried to do this via NativeModules, but I get errors about RCTConvert being run on a background thread instead of the main thread.
Images.h
#import <Foundation/Foundation.h>
#import <React/RCTBridgeModule.h>

@interface Images : NSObject <RCTBridgeModule>
@end
Images.m
#import <Foundation/Foundation.h>
#import <UIKit/UIKit.h>
#import "Images.h"

@implementation Images

RCT_EXPORT_MODULE()

// All the methods are implemented in a Swift extension, see ImagesExtension.swift
RCT_EXTERN_METHOD(convertImageTobase64:(nonnull NSString*)format image:(nonnull UIImage*)image callback:(RCTResponseSenderBlock))

@end
ImagesExtension.swift
import UIKit

public enum ImageFormat {
    case png
    case jpeg(CGFloat)
}

@objc extension Images {
    @objc func convertImageTobase64(_ format: NSString, image: UIImage, callback: @escaping ([Any]?) -> Void) {
        var imageData: Data?
        switch format as String {
        case ".png":
            imageData = UIImagePNGRepresentation(image)
        case ".jpeg":
            imageData = UIImageJPEGRepresentation(image, 1.0)
        default:
            let error = RCTMakeError("Invalid image format", nil, nil)
            callback([[error], []])
            return
        }
        let base64string = imageData?.base64EncodedString() ?? ""
        callback([[NSNull()], [base64string]])
    }
}
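The background-thread error mentioned above typically means the exported method runs on React Native's default background queue, while RCTConvert needs to create the UIImage on the main queue. One possible fix, sketched in Swift as an untested assumption (the methodQueue property is declared by the RCTBridgeModule protocol):

```swift
import UIKit

@objc extension Images {
    // Ask React Native to invoke this module's exported methods on the
    // main queue, so RCTConvert can build UIImage values safely.
    // Untested sketch; methodQueue comes from RCTBridgeModule.
    var methodQueue: DispatchQueue {
        return DispatchQueue.main
    }
}
```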
I've tried 4 different React Native libraries and nothing works. I get errors that the library doesn't exist, even though I do the npm install and confirm the library exists in node_modules. I even removed the node_modules folder and rebuilt it with npm install.
2 of the libraries that I've tried.
npm version that I'm using is: 6.4.1
node version that I'm using is: 8.12.0
Xcode v10
react-native-image-base64
react-native-image-to-base64
You can use the rn-fetch-blob package; its file-stream functionality will serve your purpose.
Hope this helps you. If you want further assistance, ping me by commenting on this post.

fatal error: objc/Object.h: No such file or directory|

I followed the tutorial on how to set up an Objective-C compiler with Code::Blocks (and of course I set up GNUstep as well: http://www.gnustep.org/experience/Windows.html): http://wiki.codeblocks.org/index.php?title=Installing_Objective-C_Compiler
I tried a simple hello world and it works. But when I tried to compile their sample code files, which contain
#import <objc/Object.h>

@interface TestObject : Object
{
    int internalInt;
}
- (int)add:(int)anInt;
- (int)subtract:(int)anInt;
- (int)value;
@end
I got this error:
fatal error: objc/Object.h: No such file or directory
So what should I add to the Code::Blocks settings to really make it work?