How do I access an asset's URL when linked via reference in Sanity Studio? - sanity

I want to upload PDFs in Sanity Studio, then link to those PDFs in the main site content.
In Sanity Studio, I've added a reference to a document containing a 'file' field to my simpleBlockContent input.
I've created a document schema for the PDF:
export default {
  title: "PDF Upload",
  name: "pdfDocument",
  type: "document",
  fields: [
    {
      name: "title",
      type: "string",
      title: "Title",
      description: "This title will be used as a caption for the download.",
    },
    {
      name: "pdfFile",
      type: "file",
      title: "PDF File",
      options: {
        accept: ".pdf",
      },
      validation: (Rule) => Rule.required(),
      description: "Note that the file name will be visible to end users downloading the file.",
    },
  ],
};
And I'm attempting to reference it in my input component's schema:
export default {
  title: "Simple Block Content",
  name: "simpleBlockContent",
  type: "array",
  of: [
    {
      title: "Block",
      type: "block",
      styles: [],
      marks: {
        annotations: [
          {
            name: "pdfLink",
            type: "object",
            title: "PDF download link",
            fields: [
              {
                name: "pdfReference",
                type: "reference",
                title: "PDF Document",
                to: [{ type: "pdfDocument" }],
              },
            ],
          },
        ],
      },
    },
  ],
};
However, when I add pdfLink to my serializers.js on the frontend, nothing resembling a link to the file is present in the data passed to it by the _rawContent GraphQL query that handles all other page content.
How can I access the information needed to build a URL that links to the uploaded asset?

I've yet to do this in a serializer, but it looks as though the asset URL should be accessible in the returned document, according to the docs:
Example of returned asset document:
{
  "_id": "image-abc123_0G0Pkg3JLakKCLrF1podAdE9-538x538-jpg",
  "_type": "sanity.imageAsset", // type is prefixed by sanity schema
  "assetId": "0G0Pkg3JLakKCLrF1podAdE9",
  "path": "images/myproject/mydataset/abc123_0G0Pkg3JLakKCLrF1podAdE9-538x538.jpg",
  "url": "https://cdn.sanity.io/images/myproject/mydataset/abc123_0G0Pkg3JLakKCLrF1podAdE9-538x538.jpg",
  "originalFilename": "bicycle.jpg",
  "size": 2097152, // File size, in bytes
  "metadata": {
    "dimensions": {
      "height": 538,
      "width": 538,
      "aspectRatio": 1.0
    },
    "location": { // only present if the original image contained location metadata
      "lat": 59.9241370,
      "lon": 10.7583846,
      "alt": 21.0
    }
  }
}
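To turn the reference on the annotation into an actual URL, you generally have to follow the reference through to the file asset. As a rough sketch (assuming a hypothetical "page" document type with a simpleBlockContent field named content, and the pdfLink annotation from the question), a GROQ query run with @sanity/client could dereference the asset like this:
import { createClient } from "@sanity/client";

const client = createClient({
  projectId: "myproject", // assumption: your own project id
  dataset: "mydataset",   // assumption: your own dataset
  apiVersion: "2023-01-01",
  useCdn: true,
});

// Follow pdfReference -> pdfDocument -> pdfFile.asset and pull out the CDN URL.
const query = `*[_type == "page"]{
  title,
  content[]{
    ...,
    markDefs[]{
      ...,
      _type == "pdfLink" => {
        "pdfUrl": pdfReference->pdfFile.asset->url,
        "pdfTitle": pdfReference->title
      }
    }
  }
}`;

client.fetch(query).then((pages) => console.log(pages));
If you are querying through gatsby-source-sanity, the _raw fields should also accept a resolveReferences argument (e.g. _rawContent(resolveReferences: {maxDepth: 10})) that inlines the referenced document, but verify this against your plugin version.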

I was looking for a way to get an instant link in Sanity Studio when someone uploads a file, and couldn't find any good solution, so I came up with my own.
Problem
Let people upload files to Sanity and get an instant link that they can copy and paste into a blog post, case study, etc.
Solution
Use a slug field: in its options you have access to the document, where you can generate the link. My code:
import { tryGetFile } from '@sanity/asset-utils'; // this function builds the production URL

const pdfUploader = {
  name: 'pdfUploader',
  title: 'Upload PDF and Get Link',
  type: 'document',
  preview: {
    select: {
      title: 'title',
    },
  },
  fields: [
    {
      name: 'title',
      title: 'Title',
      description: 'Name displayed on the PDF list',
      type: 'string',
      validation: (Rule) => [Rule.required()],
    },
    {
      name: 'pdfFile',
      title: 'Upload PDF File',
      description: 'PDF file you want to upload; once uploaded, click generate URL',
      type: 'file',
      validation: (Rule) => [Rule.required()],
    },
    {
      name: 'generatedPDFURL',
      title: 'Generate URL Link to this PDF',
      description:
        'Click GENERATE to get a link to the PDF file; if you change it by mistake, click generate again. Then copy the link below and paste it anywhere you want.',
      type: 'slug',
      options: {
        // source receives all the data currently in this document as its argument;
        // tryGetFile() then resolves the file asset with all its attributes (url, original name, etc.)
        source: ({ pdfFile }) => {
          if (!pdfFile) return 'Missing PDF File';
          const { asset } = tryGetFile(pdfFile?.asset?._ref, {
            // put your own envs here
            dataset: process.env.SANITY_DATASET,
            projectId: process.env.SANITY_PROJECT_ID,
          });
          return asset?.url;
        },
        // this slugify keeps the original link intact and prevents "/" from being turned into "-"
        slugify: (link) => link,
      },
      validation: (Rule) => [Rule.required()],
    },
  ],
};

export default pdfUploader;
After this, upload a file in Sanity and then click GENERATE to get the link.
Hope it helps people who are looking for a similar solution; a slug isn't the perfect choice, but it works :)

Related

Mimic the ( Show All ) link in datatables.net

I have a situation where I want to get the full data from the backend as a CSV file. I have already prepared the backend for that, but normally the front-end state (filters) is not in contact with the backend unless I send a request, so I managed to solve the problem by mimicking the process of showing all data, but via a custom button and a GET request (not an ajax request). Note that I am using serverSide: true in DataTables.
I prepared the backend to receive a request like (Show All), but I want that link to be sent by a custom button (Export All), not by the show process itself, because showing all the data is not practical at all.
This is the code for the custom button
{
  text: "Export All",
  action: function (e, dt, node, config) {
    // get the backend file here
  },
},
So, how could I send the same request that (Show All) sends, but from a custom button? I have prepared the server to respond with the CSV file; I just need a way to build the same link and send it as a GET request (not ajax).
If you are using serverSide: true that should mean you have too much data to use the default (serverSide: false) - because the browser/DataTables cannot handle the volume. For this reason I would say you should also not try to use the browser to generate a full export - it's going to be too much data (otherwise, why did you choose to use serverSide: true?).
Instead, use a server-side export utility - not DataTables.
But if you still want to pursue this approach, you can build a custom button which downloads the entire data set to DataTables (in your browser) and then exports that complete data to Excel.
Full Disclosure:
The following approach is inspired by the following DataTables forum post:
Customizing the data from export buttons
The following approach requires you to have a separate REST endpoint which delivers the entire data set as a JSON response (by contrast, the standard response should only be one page of data for the actual table data display and pagination.)
How you set up this endpoint is up to you (in Laravel, in your case).
Step 1: Create a custom button:
I tested with Excel, but you can do CSV, if you prefer.
buttons: [
  {
    extend: 'excelHtml5', // or 'csvHtml5'
    text: 'All Data to Excel', // or CSV if you prefer
    exportOptions: {
      customizeData: function (d) {
        var exportBody = GetDataToExport();
        d.body.length = 0;
        d.body.push.apply(d.body, exportBody);
      }
    }
  }
],
Step 2: The export function, used by the above button:
function GetDataToExport() {
  var jsonResult = $.ajax({
    url: '[your_GET_EVERYTHING_url_goes_here]',
    success: function (result) {},
    async: false
  });
  var exportBody = jsonResult.responseJSON.data;
  return exportBody.map(function (el) {
    return Object.keys(el).map(function (key) {
      return el[key];
    });
  });
}
In the above code, my assumption is that the JSON response has the standard DataTables object structure - so, something like:
{
  "data": [
    {
      "id": "1",
      "name": "Tiger Nixon",
      "position": "System Architect",
      "salary": "$320,800",
      "start_date": "2011/04/25",
      "office": "Edinburgh",
      "extn": "5421"
    },
    {
      "id": "2",
      "name": "Garrett Winters",
      "position": "Accountant",
      "salary": "$170,750",
      "start_date": "2011/07/25",
      "office": "Tokyo",
      "extn": "8422"
    },
    {
      "id": "3",
      "name": "Ashton Cox",
      "position": "Junior Technical Author",
      "salary": "$86,000",
      "start_date": "2009/01/12",
      "office": "San Francisco",
      "extn": "1562"
    }
  ]
}
So, it's an object, containing a data array.
The DataTables customizeData function is what controls writing this complete JSON to the Excel file.
Overall, your DataTables code will look something like this:
$(document).ready(function() {
  $('#example').DataTable({
    serverSide: true,
    dom: 'Brftip',
    buttons: [
      {
        extend: 'excelHtml5',
        text: 'All Data to Excel',
        exportOptions: {
          customizeData: function (d) {
            var exportBody = GetDataToExport();
            d.body.length = 0;
            d.body.push.apply(d.body, exportBody);
          }
        }
      }
    ],
    ajax: {
      url: "[your_SINGLE_PAGE_url_goes_here]"
    },
    "columns": [
      { "title": "ID", "data": "id" },
      { "title": "Name", "data": "name" },
      { "title": "Position", "data": "position" },
      { "title": "Salary", "data": "salary" },
      { "title": "Start Date", "data": "start_date" },
      { "title": "Office", "data": "office" },
      { "title": "Extn.", "data": "extn" }
    ]
  });
});

function GetDataToExport() {
  var jsonResult = $.ajax({
    url: '[your_GET_EVERYTHING_url_goes_here]',
    success: function (result) {},
    async: false
  });
  var exportBody = jsonResult.responseJSON.data;
  return exportBody.map(function (el) {
    return Object.keys(el).map(function (key) {
      return el[key];
    });
  });
}
Just to repeat my initial warning: This is probably a bad idea, if you really needed to use serverSide: true because of the volume of data you have.
Use a server-side export tool instead - I'm sure Laravel/PHP has good support for generating Excel files.
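If you do go the server-side route and still want the export to respect the table's current filter and sort state, one option (a sketch only, not tested against your backend) is a custom button that reads the parameters from the table's last server-side request via dt.ajax.params() and redirects the browser to your CSV endpoint with those same parameters, so the download is a plain GET rather than ajax:
buttons: [
  {
    text: 'Export All',
    action: function (e, dt, node, config) {
      // dt.ajax.params() returns the parameters DataTables sent with its last
      // server-side request, so the export sees the same search/order state.
      var params = $.param(dt.ajax.params());
      // '/export-all' is a hypothetical Laravel route that streams the CSV file.
      window.location = '/export-all?' + params;
    }
  }
],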

Two doc pages in docusaurus [duplicate]

As I know, Docusaurus supports customized pages, but is there a way to have two docs in one Docusaurus project?
The original Navbar items have:
Docs
Blog
...
I want to have something like this:
Docs 1
Docs 2
Blog
...
I know I can make many subfolders just in one doc, but for some reason, I want a two Docs structure, which gives me a cleaner way to access docs.
If Docusaurus cannot currently offer this feature, are there other documentation frameworks that do?
You need to use the plugin-content-docs.
First, create the other docs folders, for example docs, docs-api, docs-system.
(1) In your docusaurus.config.js file, configure your "default" docs:
module.exports = {
  // […]
  presets: [
    [
      '@docusaurus/preset-classic',
      {
        docs: {
          routeBasePath: 'docs',
          path: 'docs',
          sidebarPath: require.resolve('./sidebars.js'),
          lastVersion: 'current',
          onlyIncludeVersions: ['current'],
        },
        theme: {
          customCss: require.resolve('./src/css/custom.css'),
        },
      },
    ],
  ],
  // […]
};
(2) Now, the magic!: in the same file, configure your other documents:
module.exports = {
  // […]
  plugins: [
    // […]
    [
      '@docusaurus/plugin-content-docs',
      {
        id: 'docs-api',
        path: 'docs-api',
        routeBasePath: 'docs-api',
        sidebarPath: require.resolve('./sidebars.js'),
      },
    ],
    [
      '@docusaurus/plugin-content-docs',
      {
        id: 'docs-system',
        path: 'docs-system',
        routeBasePath: 'docs-system',
        sidebarPath: require.resolve('./sidebars.js'),
      },
    ],
  ],
  // […]
};
(3) Now you probably want these documents in your navbar, right? So add them!
module.exports = {
  // […]
  navbar: {
    hideOnScroll: true,
    title: 'your title',
    logo: {
      alt: '',
      src: 'img/favicon.ico',
    },
    items: [
      {
        to: '/docs/Intro', // ./docs/Intro.md
        label: 'Docs Title',
        position: 'left',
        activeBaseRegex: `/docs/`,
      },
      {
        to: '/docs-api/Intro', // ./docs-api/Intro.md
        label: 'API',
        position: 'left',
        activeBaseRegex: `/docs-api/`,
      },
      {
        to: '/docs-system/Introducao', // ./docs-system/Introducao.md
        label: 'My System',
        position: 'left',
        activeBaseRegex: `/docs-system/`,
      },
    ],
  },
  // […]
};
IMPORTANT
Sometimes you will modify your docusaurus.config.js and it will not seem to "work"; close the Docusaurus dev server (just Ctrl+C in your terminal/PowerShell) and restart it. I could have saved a few hours if I had known this before.
If you don't have the plugin-content-docs plugin, just install it:
npm install --save @docusaurus/plugin-content-docs
ROADMAP
I had a hard time figuring this out. What I did was download the whole Docusaurus project, take the website part, trim everything I did not need, and this is what I got.
REFERENCES (Update 2022/03/02)
https://docusaurus.io/docs/docs-multi-instance
This solution worked for me. Using the 'autogenerated' sidebar in Docusaurus v2.0.0-beta.15
sidebars.js
/** @type {import('@docusaurus/plugin-content-docs').SidebarsConfig} */
const sidebars = {
  // tutorialSidebar: [{type: 'autogenerated', dirName: '.'}],
  newone: [{type: 'autogenerated', dirName: 'newone'}], // foldername
  newtwo: [{type: 'autogenerated', dirName: 'newtwo'}], // foldername
};
module.exports = sidebars;
docusaurus.config.js
navbar: {
  title: 'My Site',
  logo: {
    alt: 'My Site Logo',
    src: 'img/logo.svg',
  },
  items: [
    // {
    //   type: 'doc',
    //   docId: 'intro',
    //   position: 'left',
    //   label: 'Tutorials',
    // },
    {
      type: 'docSidebar', // docSidebar
      position: 'left',
      sidebarId: 'newone', // foldername
      label: 'NEWONE', // navbar title
    },
    {
      type: 'docSidebar', // docSidebar
      position: 'left',
      sidebarId: 'newtwo', // foldername
      label: 'NEWTWO', // navbar title
    },
    {to: '/blog', label: 'Blog', position: 'left'},
    {
      href: 'https://github.com/facebook/docusaurus',
      label: 'GitHub',
      position: 'right',
    },
  ],
},
Your docs folder:
docs/
  newone/
    intro.md
  newtwo/
    intro.md
I tried this way and it's working.
[Edit 1]: But when I select API, both API and Docs in the navbar become green (active). Can you tell us the reason behind this, @Yangshun Tay, and can you suggest an edit for that?
[Edit 2]: I read the documentation; it's written in @docusaurus/theme-classic that if we set the activeBasePath property, then links sharing that common path (docs in this case) will get the active attribute.
sidebar.js
module.exports = {
  someSidebar: {
    Docusaurus: ['doc1', 'doc2'],
    Features: ['doc3'],
  },
  someOtherSidebar: {
    Test: ['mdx'],
  },
};
docusaurus.config.js
The navbar links are like this -
links: [
  {
    to: 'docs/doc1',
    // activeBasePath: 'docs', // [Edit 3]
    label: 'Docs',
    position: 'left'
  },
  {
    to: 'docs/mdx',
    label: 'API',
    position: 'left'
  },
]
Folder structure of docs folder is like this -
docs
├── docs1.md
├── mdx.md
Regardless of whether you're using v1 or v2, the sidebars.js configuration can contain multiple keys, each having its own sidebar.
You need to use the doc type in the Docusaurus config; I think the "to" type is for pages, not docs.
To get the sidebar right, you also need to set the activeSidebarClassName value in the config to let it know which sidebar (among those you exported in sidebars.js) you want to use for this doc.
activeSidebarClassName: 'navbar__link--active',
https://docusaurus.io/docs/api/themes/configuration#navbar-doc-link
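For reference, a navbar item of type 'doc' looks roughly like this (a sketch reusing the doc IDs from the sidebars.js shown above; adjust docId to your own documents):
navbar: {
  items: [
    // Each entry links to one doc and highlights the sidebar that contains it.
    { type: 'doc', docId: 'doc1', label: 'Docs', position: 'left' },
    { type: 'doc', docId: 'mdx', label: 'API', position: 'left' },
  ],
},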
Setting up Docusaurus to be multi-instance spans changes across many files. To make it easier to set up, I've created a base install with all the necessary changes to go multi-instance, and have released it as a GitHub template.
Fork it here:
mg0716/docusaurus-multi
Many of the changes in this repo were a result of @d-kastier's original comment.
Very open to feedback and pull requests, so feel free to give it a shot!
In my testing, you MUST include the path "docs-xxxxxxxxx"! Do not create another name such as "education" or you will get a page crash!

FullCalendar and ICalendar feeds with custom fields

I would like to load an ics file in FullCalendar (Vue.js) with custom fields.
I can see the ics events in Vue.js, but without the custom fields.
I use FullCalendar with the fullcalendar/icalendar plugin to parse the ics file:
data() {
  return {
    selectedEvent: {},
    calendarOptions: {
      themeSystem: 'bootstrap',
      editable: false,
      plugins: [ dayGridPlugin, timeGridPlugin, listPlugin, bootstrapPlugin, iCalendarPlugin ],
      initialView: 'timeGridWeek',
      headerToolbar: {
        left: 'prev,next today',
        center: 'title',
        right: 'dayGridMonth,timeGridWeek,timeGridDay',
      },
      events: {
        url: 'my_ics_url',
        format: 'ics',
      },
      eventClick: function(info) {
        console.log(info.event)
      },
    },
  }
}
It works; I can see my events in the calendar.
But I would like to store my custom fields in the event's extendedProps. Currently the only fields I find in extendedProps are description, location and organizer.
My ics file :
BEGIN:VEVENT
SUMMARY:Evenement_A
UID:ABC-2021-03-26-12:30:00
STATUS:CONFIRMED
DTSTAMP:20210326T123000
DTSTART:20210326T123000
DTEND:20210326T153000
LAST-MODIFIED:19700101T010000
LOCATION:
SITE:
TITLE:event_title
CUSTOM_FIELDS:azerty
END:VEVENT
Thanks for your help.
Michael

I am trying to export this JSON data for importing in the project however it shows unexpected syntax error

When it was in the same file as the component, it worked perfectly. Now I am trying to move it out of the component to make the project cleaner. However, it shows unexpected syntax errors. The file format is JSON. How can this code be fixed so it works as it should?
const postsData = [
  {
    id: 1,
    title: "How to start a business with 100$",
    published: "14h ago",
    image: require("../img/img1.jpg"),
  },
  {
    id: 2,
    title: "Get funding for your startup",
    published: "19h ago",
    image: require("../img/img2.jpg"),
  },
  {
    id: 3,
    title: "Latest Fashion Trends for 2018",
    published: "14h ago",
    image: require("../img/img3.jpg"),
  },
];

export { postsData };
Try this:
export default [
  {
    id: 1,
    title: "How to start a business with 100$",
    published: "14h ago",
    image: require("../img/img1.jpg"),
  },
  {
    id: 2,
    title: "Get funding for your startup",
    published: "19h ago",
    image: require("../img/img2.jpg"),
  },
  {
    id: 3,
    title: "Latest Fashion Trends for 2018",
    published: "14h ago",
    image: require("../img/img3.jpg"),
  },
]
and import it like
import postsData from 'filepath';
PS:
There is no naming convention for postsData; you can name it anything.
Your file should have the extension .js in your case, because you're not using plain JSON objects.
You need to use double quotes, like below:
{
  "id": "1",
  "title": "How to start a business with 100$",
  "published": "14h ago"
}
Please use a JSON linter (e.g. JSONLint) to validate the JSON.
Please follow this structure:
[
  {
    "id": "1",
    "title": "How to start a business with 100$",
    "published": "14h ago",
    "images": [{ "bannerImg1": "./images/effy.jpg" }]
  },
  {
    "id": "2",
    "title": "Get funding for your startup",
    "published": "19h ago",
    "images": [{ "bannerImg1": "./images/effy.jpg" }]
  },
  {
    "id": "3",
    "title": "Latest Fashion Trends for 2018",
    "published": "14h ago",
    "images": [{ "bannerImg1": "./images/effy.jpg" }]
  }
]
If it is a JSON file like MyFile.json, then you can't export anything from the JSON file with your current code, as it contains keywords like export or const, and you can't use JavaScript keywords in a JSON file.
You can change the extension of the file to MyFile.js
OR
You can create a JSON file like MyFile.json and put only JSON in it:
{
  "id": "1",
  "title": "How to start a business with 100$",
  "published": "14h ago"
}
and import it from a JS file like:
import data from './MyFile.json'
This is not JSON; it's just a JavaScript file (according to its syntax).
The last line constructs an object with a shorthand property and exports it in the form export { name1, name2, …, nameN }; where name1, name2, ..., nameN are named exports.
To import a named export, use the following syntax:
import { export1 } from "module-name";
In your case it would be
import { postsData } from "<your file without the js extension>"
As said above, this is not JSON but just an object.
First, create a new js file; you can name it whatever you want, for example data.js.
Then write this in data.js:
export default {
  postsData: [
    {
      id: 1,
      title: "How to start a business with 100$",
      published: "14h ago",
      image: require("../img/img1.jpg"),
    },
    {
      id: 2,
      title: "Get funding for your startup",
      published: "19h ago",
      image: require("../img/img2.jpg"),
    },
    {
      id: 3,
      title: "Latest Fashion Trends for 2018",
      published: "14h ago",
      image: require("../img/img3.jpg"),
    },
  ],
};
Finally, in your component file:
import xxxx from './data';
This way you can access your data as xxxx.postsData.
With CommonJS modules:
const express = require("express");
const config = require("./config.json");
With TypeScript (enable resolveJsonModule in tsconfig.json):
{
  "compilerOptions": {
    "target": "es2015",
    "module": "commonjs",
    "strict": true,
    "moduleResolution": "node",
    "resolveJsonModule": true
  }
}
import law from './keywords/law.json'
import special from './keywords/special.json'
If you have a .json file and everything inside it is JSON, then you don't need to export it like a normal module. You simply import it in the other module and use it like a normal object. That's it!
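For example, a minimal sketch (the file name posts.json and its contents are placeholders; plain webpack/CRA setups handle JSON imports out of the box, while TypeScript needs resolveJsonModule as shown above):
// posts.json contains only JSON, no export statements:
// { "posts": [{ "id": 1, "title": "How to start a business with 100$" }] }
import postsData from './posts.json';

console.log(postsData.posts[0].title); // "How to start a business with 100$"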

Dojo Gridx with JsonStore

I'm trying to connect a Gridx grid to a JsonStore. The code and data are below. The problem is that the Gridx renders correctly but says: No items to display. Does anybody know what I'm doing wrong? Dojo and Gridx are the latest versions, installed with cpm.
Edit: there is no ajax request to /test/ in the Firebug/Chrome development tools.
structure: [
  { field: 'id', name: 'Id' },
  { field: 'title', name: 'Title' },
  { field: 'artist', name: 'Artist' }
],
store: new JsonRestStore({
  idAttribute: 'id',
  target: '/test/'
}),
Data returned by /test is like this:
{
  identifier: "id",
  label: "title",
  items: [
    {
      id: 1,
      title: "Title 1",
      artist: "Artist 1"
    },
    {
      id: 2,
      title: "Title 2",
      artist: "Artist 2"
    },
    ...
  ]
}
Grid is created with:
this.grid = new Grid({
  structure: structure,
  store: store,
  modules: [
    Pagination,
    PaginationBar,
    //Focus,
    SingleSort,
    ToolBar
  ],
  //paginationInitialPage: 3,
  paginationBarSizes: [10, 25, 50, 100],
  paginationBarVisibleSteppers: 5,
  paginationBarPosition: 'bottom'
}, this.gridNode);
Have you specified which cache to use? In your case it should be an Async cache.
require([
  'gridx/core/model/cache/Async',
  .....
], function(Cache, ...){
  this.grid = new Grid({
    cacheClass: Cache,
    ......
  });
});
I've found that this happens when the server doesn't return the Content-Range header in the response. Apparently the store isn't smart enough to just count the items in the returned array...
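As an illustration (a sketch assuming an Express-style backend, which is an assumption since the question doesn't say what the server is), the /test/ endpoint would need to send that header along with the data:
const express = require('express');
const app = express();

const albums = [
  { id: 1, title: 'Title 1', artist: 'Artist 1' },
  { id: 2, title: 'Title 2', artist: 'Artist 2' },
];

app.get('/test/', (req, res) => {
  // JsonRestStore/Gridx read the total item count from the Content-Range header,
  // formatted as "items <first>-<last>/<total>", e.g. "items 0-1/2".
  res.set('Content-Range', `items 0-${albums.length - 1}/${albums.length}`);
  res.json(albums);
});

app.listen(3000);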