Adding Cloudinary to Strapi

Can someone tell me how to install Cloudinary in my Strapi app? I installed the plugin as the documentation says, but it doesn't show up at all in my project. Can someone tell me what I'm doing wrong?

There is an example in the Strapi documentation:
https://strapi.io/documentation/3.0.0-beta.x/plugins/upload.html#using-a-provider
To enable the provider for Cloudinary, create or edit the file at ./extensions/upload/config/settings.json
{
  "provider": "cloudinary",
  "providerOptions": {
    "cloud_name": "PROVIDER_CLOUD_NAME",
    "api_key": "PROVIDER_API_KEY",
    "api_secret": "PROVIDER_API_SECRET"
  }
}
Of course, you should replace PROVIDER_CLOUD_NAME, PROVIDER_API_KEY and PROVIDER_API_SECRET with the appropriate values, which can be found in your Cloudinary account.
If you want a specific configuration per environment, you can edit the file at ./extensions/upload/config/settings.js like this:
if (process.env.NODE_ENV === 'production') {
  module.exports = {
    provider: 'cloudinary',
    providerOptions: {
      cloud_name: process.env.PROVIDER_CLOUD_NAME,
      api_key: process.env.PROVIDER_API_KEY,
      api_secret: process.env.PROVIDER_API_SECRET
    }
  };
} else {
  // To use the default local provider you can return an empty configuration.
  module.exports = {};
}
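One thing worth checking if the provider still doesn't appear (an assumption on my part, not part of the answer above): the Cloudinary provider is a separate npm package and has to be installed in the project in addition to adding the config file. For Strapi 3.x that package is strapi-provider-upload-cloudinary:

npm install strapi-provider-upload-cloudinary --save

After installing it, restart Strapi so the provider is picked up.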

Related

Self Signed Cert error in Nuxt trying to generate static site locally

I'm building a Vue/Nuxt (2) site running against a .NET Web API. This site is already deployed in a staging capacity and is building and running as a statically generated site on Netlify. Now, I know it's not quite right, as my content is not being rendered into the deployed files, so effectively it's running as a SPA. Not quite what I saw happening in dev 5 weeks ago, but I didn't think anything of it; I'd fix it later.
I've finally got a chance to work on this project again and proceeded to make the necessary changes so the content should be fetched via my dynamic route builder in nuxt.config.js (existing) and output during build via the asyncData hook in my pages (new).
Nuxt.config
import axios from 'axios';

// Generate dynamic page routes
let dynamicRoutes = async () => {
  console.log( `${ process.env.API_BASE_URL }/page/main/generate` );
  const fetchedConditions = await axios.get( `${ process.env.API_BASE_URL }/page/main/generate` );
  const routesForConditions = fetchedConditions.data.map( ( condition ) => {
    return {
      route: `/conditions/${ condition.id }/${ condition.urlPath }`,
      payload: condition
    }
  } );
  console.log( `${ process.env.API_BASE_URL }/faq/top/generate?count=10` );
  const fetchedFaqs = await axios.get( `${ process.env.API_BASE_URL }/faq/top/generate?count=10` );
  const routesForFaqs = fetchedFaqs.data.map( ( faq ) => {
    return {
      route: `/frequently-asked-questions/${ faq.categoryId }/${ faq.id }/${ faq.urlPath }`,
      payload: faq
    }
  } );
  const routes = [ ...routesForConditions, ...routesForFaqs ];
  return routes;
}
export default {
  target: 'static',
  ssr: false,
  generate: {
    crawler: true,
    routes: dynamicRoutes
  },
  server: {
    port: 3001
  }...
Condition page
async asyncData(ctx) {
  util.debug('Async data call...');
  if (ctx.payload) {
    ctx.store.dispatch("pages/storePage", ctx.payload);
    return { condition: ctx.payload };
  } else {
    const pageResponse = await ctx.store.dispatch('pages/getCurrentPage', { pageId: ctx.route.params.id });
    return { condition: pageResponse };
  }
}
So far so good, except now, when I try to generate the site in development (i.e. "npm run generate"), the dynamic route generator code cannot reach my local API running over HTTPS and fails with a "Nuxt Fatal Error: self signed certificate".
https://localhost:5001/api/page/main/generate 12:06:43
ERROR self signed certificate 12:06:43
at TLSSocket.onConnectSecure (node:_tls_wrap:1530:34)
at TLSSocket.emit (node:events:390:28)
at TLSSocket._finishInit (node:_tls_wrap:944:8)
at TLSWrap.ssl.onhandshakedone (node:_tls_wrap:725:12)
This worked 5 weeks ago and as far as I am aware I have not changed anything that should impact this. No software or packages have been updated (except windows updates perhaps). The API is still using .NET 5.0 and running on Kestrel using the default self signed cert on localhost (which is listed as valid in Windows). I simply added the payload to the routes, added the asyncData hook, and modified the page code accordingly.
I've Googled up a few tidbits, but none have resolved the issue and now I'm at a loss. It really shouldn't be this blimmin' opaque in 2022.
Tried disabling SSL via a proxy in nuxt.config;
proxy: {
  '/api/': {
    target: process.env.API_BASE_URL,
    secure: !process.env.ENV === 'development'
  }
}
Tried modifying my Axios plugin to ignore auth;
import https from 'https';

export default function ( { $axios } ) {
  $axios.defaults.httpsAgent = new https.Agent( { rejectUnauthorized: false } );
}
And a variation of;
import https from 'https';

export default function ( { $axios, store } ) {
  const agent = new https.Agent( {
    rejectUnauthorized: false
  } );
  $axios.onRequest( config => {
    if ( process.env.dev )
    {
      config.httpsAgent = agent;
    }
  } );
}
None of these 'worked for other people' solutions are working for me.
Also, the client-side apps (public/admin) themselves have no problem working against my API locally; it's only the route builder within nuxt.config or the asyncData code which is throwing this error.
Any suggestions would be appreciated. Happy to add other relevant code if needed, just not sure which atm.
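One detail worth illustrating (my own observation, not from the original post): the dynamicRoutes function in nuxt.config.js calls the plain axios import at build time, so an agent configured on the $axios plugin never applies to those requests. A minimal sketch of passing an agent straight to those calls, reusing the same API_BASE_URL endpoints as above:

import https from 'https';
import axios from 'axios';

// Only relax certificate checks for local generation, never for production builds.
const agent = new https.Agent( { rejectUnauthorized: false } );

let dynamicRoutes = async () => {
  const fetchedConditions = await axios.get(
    `${ process.env.API_BASE_URL }/page/main/generate`,
    { httpsAgent: agent }
  );
  // ...map fetchedConditions.data to routes exactly as in the original snippet
};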

Expo App environments for Dev, UAT and Production

I have a React Native app built in Expo that connects to a REST API. There are three environments for the REST API - dev, UAT and production - as below (example).
dev = https://dev.myapi.com/api
uat = https://uat.myapi.com/api
prod = https://prod.myapi.com/api
Depending on where the app is being used it needs to connect to the correct environment.
Running in the Expo Client = Dev API
Running in TestFlight or Internal Testing for the Play Store = UAT API
Running in the App Store or Play Store = Production API
What is the simplest way to achieve this?
Follow the steps below.
Install the expo-constants package. To install it, run the command below.
npm i expo-constants
Add an environment.js file and paste in the code below.
import Constants from 'expo-constants';
import { Platform } from 'react-native';

const localhost = Platform.OS === 'ios' ? 'localhost:8080' : '10.0.2.2:8080';

const ENV = {
  dev: {
    apiUrl: 'https://dev.myapi.com/api',
    amplitudeApiKey: null,
  },
  staging: {
    apiUrl: 'https://uat.myapi.com/api',
    amplitudeApiKey: '[Enter your key here]',
    // Add other keys you want here
  },
  prod: {
    apiUrl: 'https://prod.myapi.com/api',
    amplitudeApiKey: '[Enter your key here]',
    // Add other keys you want here
  },
};
const getEnvVars = (env = Constants.manifest.releaseChannel) => {
  // What is __DEV__?
  // This variable is set to true when react-native is running in Dev mode.
  // __DEV__ is true when run locally, but false when published.
  if (__DEV__) {
    return ENV.dev;
  } else if (env === 'staging') {
    return ENV.staging;
  } else if (env === 'prod') {
    return ENV.prod;
  }
};

export default getEnvVars;
Accessing Environment Variables
// Import getEnvVars() from environment.js
import getEnvVars from '../environment';
const { apiUrl } = getEnvVars();

/******* SESSIONS::LOG IN *******/
// LOG IN
// credentials should be an object containing phone number:
// {
//   "phone": "9876342222"
// }
export const logIn = (credentials, jsonWebToken) =>
  fetch(`${apiUrl}/phone`, {
    method: 'POST',
    headers: {
      Authorization: 'Bearer ' + jsonWebToken,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify(credentials),
  });
To create the builds, use the commands below.
Dev - expo build:ios --release-channel dev
Staging - expo build:ios --release-channel staging
Production - expo build:ios --release-channel prod
Now that Expo supports the config file as app.config.js or app.config.ts, we can use dotenv. Check this: https://docs.expo.io/guides/environment-variables/#using-a-dotenv-file
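A minimal sketch of that dotenv approach (the variable name and values here are placeholders, not taken from the answer above):

// app.config.js
import 'dotenv/config'; // loads .env into process.env when the config is evaluated

export default {
  expo: {
    name: 'my-app',
    slug: 'my-app',
    extra: {
      apiUrl: process.env.API_URL, // read inside the app via expo-constants
    },
  },
};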
This can be done using different release channel names.
Let's say you have created 3 release channels this way:
expo publish --release-channel prod
expo publish --release-channel staging
expo publish --release-channel dev
Then you can have a function to set environment variables accordingly:
import * as Updates from 'expo-updates';

function getEnvironment() {
  if (Updates.releaseChannel.startsWith('prod')) {
    // matches prod*
    return { envName: 'PRODUCTION', dbUrl: 'ccc', apiKey: 'ddd' }; // prod env settings
  } else if (Updates.releaseChannel.startsWith('staging')) {
    // matches staging*
    return { envName: 'STAGING', dbUrl: 'eee', apiKey: 'fff' }; // stage env settings
  } else {
    // assume any other release channel is development
    return { envName: 'DEVELOPMENT', dbUrl: 'aaa', apiKey: 'bbb' }; // dev env settings
  }
}
Refer to expo documentation for more info!
For those who are using Expo SDK 46 (or any newer version), you can do it the following way.
Rename app.json to app.config.js.
Add the API URL under the extra property:
export default () => ({
  expo: {
    name: '',
    slug: '',
    extra: {
      API_URL: process.env.API_URL || null,
    },
    // ...
  },
});
We can access this API URL using Expo constants like this (wherever we want).
Don't forget to import Constants from expo-constants.
import Constants from 'expo-constants';

const myApi = Constants.expoConfig.extra.API_URL;
axios.get(myApi).... // using the API endpoint
For local development, you can access the API in two ways:
API_URL="http://localhost:3000" expo start
Or just comment out the Constants.expoConfig..... line and directly paste the local URL,
like const myApi = "http://localhost:3000"
And in eas.json
{
  "production": {
    "env": {
      "API_URL": "https://prod.example.com"
    }
  },
  "staging": {
    "env": {
      "API_URL": "https://staging.example.com"
    }
  }
}
Once we run eas build, the appropriate API endpoint will be set.
Refer to the same in the Expo documentation:
https://docs.expo.dev/eas-update/environment-variables/
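For completeness (an assumption based on my reading of the EAS docs, not stated in the answer above): in a full eas.json these profiles usually sit under a top-level build key, roughly like this:

{
  "build": {
    "production": {
      "env": { "API_URL": "https://prod.example.com" }
    },
    "staging": {
      "env": { "API_URL": "https://staging.example.com" }
    }
  }
}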

Implementing express graphql api with Gatsby graphql

So I am building an app using Express.js, GraphQL, Postgres and React. I have already built my backend, but now, instead of using React, I want to use GatsbyJS. How do I connect my Express GraphQL with Gatsby's GraphQL, or send my data directly to Gatsby's GraphQL layer?
Add the following in gatsby-config.js
module.exports = {
  plugins: [
    {
      resolve: "gatsby-source-graphql",
      options: {
        // This type will contain the remote schema Query type
        typeName: "MyGraph",
        // This is the field under which it's accessible
        fieldName: "myGraph",
        // URL to query from
        url: "http://localhost:4000/graphql",
      },
    },
  ],
}
Then, in say index.js, make the query:
import { graphql } from "gatsby"

export const query = graphql`
  query {
    myGraph {
      users {
        name
        age
      }
    }
  }
`
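As a rough sketch of how that query result then reaches the page (my own illustration; the component name is a placeholder and the users fields are carried over from the query above), Gatsby injects the result of an exported page query into the page component as a data prop:

import React from "react"

// The exported page query from the answer above lives in this same file;
// Gatsby runs it at build time and passes the result in as `data`.
const IndexPage = ({ data }) => (
  <ul>
    {data.myGraph.users.map(user => (
      <li key={user.name}>
        {user.name} ({user.age})
      </li>
    ))}
  </ul>
)

export default IndexPage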

Unable to resolve module tensorflow_inception_graph.pb as a file nor as a folder

Here I am using TensorFlow with React Native via the react-native-tensorflow library. The library has been installed properly. The code snippet which I am using and facing an issue with is:
const tfImageRecognition = new TfImageRecognition({
  model: require('./assets/tensorflow_inception_graph.pb'),
  labels: require('./assets/tensorflow_labels.txt'),
  imageMean: 117, // Optional, defaults to 117
  imageStd: 1 // Optional, defaults to 1
})
In the model property, when I load the tensorflow_inception_graph.pb file, it gives me the error:
error: bundling failed: UnableToResolveError: Unable to resolve module
`../asset/tensorflow_inception_graph.pb` from
`/Users/XYZ/App/code/Demo/src/ImageRecognitionAI.js`:
could not resolve `/Users/XYZ/App/code/Demo/src/assets/tensorflow_inception_graph.pb'
as a file nor as a folder
The file path which I am passing to model has been checked and is correct. Can anyone help me get out of this? Help will be appreciated.
Place the tensorflow_labels.txt and tensorflow_inception_graph.pb files in the Android assets folder:
=> android/app/src/main/assets/tensorflow_inception_graph.pb
=> android/app/src/main/assets/tensorflow_labels.txt
Now you can access them like this in your JS file:
const tf = new TfImageRecognition({
  model: 'file://tensorflow_inception_graph.pb',
  labels: 'file://tensorflow_labels.txt'
});
It worked for me.
You have to specify the extra asset file extensions in either the package configuration or a rn-cli.config.js file. If you're using create-react-native-app, then you want to add them to the app.json file like this:
{
  "expo": {
    "sdkVersion": "27.0.0",
    "packagerOpts": {
      "assetExts": ["pb", "txt"]
    }
  }
}
I didn't find that in the documentation for some reason, but I found it in some example projects.
If you're running your scripts with react-native start, then you need to set up a rn-cli.config.js file. Here is the documentation:
module.exports = {
  getAssetExts() {
    return ['pb', 'txt']
  }
}
If you are already running scripts with an rn-cli.config.js, change the file content to:
const { getDefaultConfig } = require("metro-config");

module.exports = (async () => {
  const {
    resolver: { assetExts }
  } = await getDefaultConfig();
  return {
    resolver: {
      assetExts: [...assetExts, "pb", "txt"]
    }
  };
})();

Can I set authorization headers with RequireJS?

We want to have 2 sets of resources for our AngularJS app (public/private), which uses RequireJS for dependency management. Basically, everything on the login page would be public, and once logged in, another AngularJS app would be loaded (with a new RequireJS config) that would load resources that require authentication to access.
Is there a way to configure requirejs to set an authorization header when loading resources?
It depends on what you mean by "resources" and how your server is configured. But in general - yes, since you are using AngularJS you can use the $httpProvider to inject an interceptor service.
For example, in a service:
var dependencies = ['$rootScope', 'userService'];

var service = function ($rootScope, userService) {
  return {
    request: function (config) {
      var currentUser = userService.getCurrentUser();
      var access_token = currentUser ? currentUser.access_token : null;
      if (access_token) {
        config.headers.authorization = access_token;
      }
      return config;
    },
    responseError: function (response) {
      if (response.status === 401) {
        $rootScope.$broadcast('unauthorized');
      }
      return response;
    }
  };
};

module.factory(name, dependencies.concat(service));
Then, after you configure your routes, you can use:
$httpProvider.interceptors.push('someService');
You can find some more information on interceptors here: https://docs.angularjs.org/api/ng/service/$http#interceptors
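A minimal sketch of where that push call typically lives (the module and service names are placeholders, not from the answer above):

// Register the interceptor factory by name during the config phase.
angular.module('app').config(['$httpProvider', function ($httpProvider) {
  $httpProvider.interceptors.push('someService');
}]);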
UPDATE
You might be able to use the text plugin to try and receive it, but I don't see the point in protecting client-side code. Plus, if you want to use optimization, the resources will just come in one file anyway...
config: {
  text: {
    onXhr: function (xhr, url) {
      xhr.setRequestHeader('Authorization', 'Basic ' + token);
    }
  }
}
Refer to: custom-xhr-hooks
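For context, a sketch of how a protected resource could then be requested through the text plugin (the template path is a placeholder, not from the answer above):

// Any dependency prefixed with text! is fetched via XHR,
// so the onXhr hook above can attach the Authorization header.
define(['text!templates/private-view.html'], function (privateViewHtml) {
  document.getElementById('app').innerHTML = privateViewHtml;
});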
Another UPDATE
You could also use urlArgs (mainly used for cache invalidation) without using the text plugin:
require.config({
  urlArgs: 'token=' + token,
  ...
});