Blank PDF output using jsreport - SQL

I know this has got to have some super easy answer but I am just beginning with this stuff and followed this tutorial: https://jsreport.net/blog/pdf-reports-in-sql-server
Except it gives me a blank page as an output with no real error message.
I'm using handlebars and chrome-pdf to do this...
My database connection script is:
const sql = require('mssql');

const config = {
    "user": "user",
    "password": "password",
    "server": "server",
    "database": "database"
}

async function beforeRender(req, res) {
    await sql.connect(config)
    const sqlReq = new sql.Request();
    const recordset = await sqlReq.query(
        `SELECT DBVersion
               ,MinAppVersion
         FROM VersionTbl`
    )
    Object.assign(req.data, { Versions: recordset });
}
I can see it running and connecting properly in the debug tab, and my SQL query seems to be correct if I test it directly against the database server.
My template looks like this:
<table>
    {{#each Version}}
    <tr>
        <td>{{DBVersion}}</td>
        <td>{{MinAppVersion}}</td>
    </tr>
    {{/each}}
</table>
The debug log...
+0 Starting rendering request 27 (user: null)
+2 Rendering template { name: PQRTemplate, recipe: chrome-pdf, engine: handlebars, preview: true }
+2 Data item not defined for this template.
+9 Resources not defined for this template.
+10 Executing script Connection using dedicated-process strategy
+779 Base url not specified, skipping its injection.
+780 Rendering engine handlebars using dedicated-process strategy
+937 Compiled template not found in the cache, compiling
+951 Executing recipe chrome-pdf
+1041 Converting with chrome HeadlessChrome/79.0.3945.0 using dedicated-process strategy
+1115 Page request: GET (document) file:///C:/Users/********/AppData/Local/Temp/jsreport/autocleanup/264c975a-9ef2-4130-960c-84eeae2ec04a-chrome-pdf.html
+1122 Page request finished: GET (document) file:///C:/Users/*******/AppData/Local/Temp/jsreport/autocleanup/264c975a-9ef2-4130-960c-84eeae2ec04a-chrome-pdf.html
+1124 Running chrome with params {"printBackground":true,"margin":{}}
+1327 Skipping storing report.
+1327 Rendering request 27 finished in 1327 ms
Can any of you smart people tell me what's going wrong? Thanks

See the answer from Jan_blaha here... https://forum.jsreport.net/topic/1814/blank-pdf-output
It perfectly solved my issue with the tutorial.
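For anyone who can't open the link: two mismatches in the snippets above would each produce an empty table on their own, so the fix is likely along these lines (a sketch, assuming mssql v4+, where query() resolves to a result object whose rows live under .recordset):

async function beforeRender(req, res) {
    await sql.connect(config);
    const result = await new sql.Request().query(
        `SELECT DBVersion, MinAppVersion FROM VersionTbl`
    );
    // Pass the rows array, not the whole result object,
    // under the same key the template iterates
    Object.assign(req.data, { Versions: result.recordset });
}

The template then has to loop over that same key, i.e. {{#each Versions}} rather than {{#each Version}}.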

Related

SharePoint Online Threshold Error after Uploading file and modifying column values (SpFx)

We are facing an issue updating library column values after uploading a file. The issue started occurring once the library exceeded 5,000 items.
Please share a code sample that works with a library holding more than 5,000 items; most available resources cover libraries with fewer than 5,000 items.
Sample Resource: https://www.c-sharpcorner.com/blogs/file-upload-and-metadata-updation-in-spfx-with-pnpjsreact
The error occurs on the f.file.getItem() line of this chain:
.files.add(fileName, element.content, true)
    .then(f => {
        f.file.getItem().then(item => {
In the console this URL is shown: /_api/web/getFolderByServerRelativeUrl('')/files()/listItemAllFields
I tested the above code snippet in a demo SPFx web part with a large library (over 7,000 items) and it works fine:
// PnPjs import (assuming the @pnp/sp v2 presets used in the linked sample)
import { sp } from "@pnp/sp/presets/all";

const file = await sp.web.getFolderByServerRelativeUrl("/sites/sbdev/My test doc lib/docs")
    .files.add(filename, fileContent, true);
const item = await file.file.getItem();
await item.update({
    Title: "A Title" + (new Date()).toLocaleDateString(),
    uuId: 18
});
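One thing worth checking, since the URL logged in the question is getFolderByServerRelativeUrl('') with an empty path: make sure the folder's server-relative URL is actually populated when files.add runs. A minimal guard (the function name and parameters are hypothetical):

async function uploadAndGetItem(folderUrl: string, fileName: string, content: Blob) {
    if (!folderUrl) {
        // An empty path would produce exactly the getFolderByServerRelativeUrl('')
        // call shown in the question's console output
        throw new Error("Folder server-relative URL is empty");
    }
    const f = await sp.web.getFolderByServerRelativeUrl(folderUrl)
        .files.add(fileName, content, true);
    return f.file.getItem();
}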
For reference, see the samples below:
https://github.com/kongmengfei/SharedSPFx/blob/main/pnpjsuploadfiles/src/webparts/pnpjsuploadfiles/components/Pnpjsuploadfiles.tsx
https://pnp.github.io/pnpjs/sp/files/#setting-associated-item-values
BR

Error -32 EPIPE broken pipe

I am doing a POST request with AJAX that should return a partial view, but I always get the following error in the log:
Connection id "0HL6PHMI6GKUP" communication error.
Microsoft.AspNetCore.Server.Kestrel.Internal.Networking.UvException: Error -32 EPIPE broken pipe
When looking at the debug log, I see that it is loading the partial view data, but then I get the error.
I can't find anything on the net about the -32 EPIPE error. Could someone explain what this error means?
Ajax call
$( "#PostForm" ).submit(function( event ) {
//Ajax call
$.ajax({
type: 'POST',
url: "/url/path/CreateBox",
data: {
"id": $("#RackId").val(),
"Name": $("#Name").val()
},
success: function(result){
$("#modal").html(result);
}
});
});
Controller
[HttpPost]
public async Task<IActionResult> CreateBox(int id, string Name)
{
    // Get the info for the given ID
    Rack rack = await this._rackAccess.GetByIdAsync(id);
    if (rack == null)
    {
        return NotFound();
    }

    Box box = new Box();
    box.Rack = rack;
    if (!string.IsNullOrEmpty(Name))
    {
        box.Name = Name;
        var result = await this._boxAccess.InsertAsync(box);
        // Returns a list of boxes
        return PartialView("Boxes", await this._boxAccess.ToRackListAsync(rack.ID));
    }
    else
    {
        // Returns the form again
        return PartialView("CreateBox", box);
    }
}
Version
ASP.NET Core: 1.1.0
"Microsoft.AspNetCore.Server.Kestrel": "1.1.0"
"Microsoft.AspNetCore.Hosting": "1.1.0",
The solution can be found on GitHub, where I posted the problem as well:
https://github.com/aspnet/KestrelHttpServer/issues/1978
Answer from halter73 on GitHub:
The "communication error" usually comes in ECONNRESET, EPIPE, and ECANCELED varieties. Which one you get usually just depends on which platform you're running on, but all three generally mean the same thing: that the client closed the connection ungracefully.
I have a theory why this is happening. I think that the page might be getting reloaded mid-xhr causing the xhr to get aborted. This can be fixed by returning false from your jQuery submit callback.
I took all your dependencies and your jQuery snippet and demonstrated how this page reload can cause an EPIPE on linux and an ECANCELED on Windows in a sample repro at https://github.com/halter73/EPIPE. It uses csproj instead of project.json because I don't have an old CLI that supports project.json easily available.
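Based on that explanation, the fix is the one-line change halter73 describes, applied to the submit handler from the question: stop the default form submission so the page is not reloaded while the XHR is in flight. A sketch:

$("#PostForm").submit(function (event) {
    $.ajax({
        type: 'POST',
        url: "/url/path/CreateBox",
        data: {
            "id": $("#RackId").val(),
            "Name": $("#Name").val()
        },
        success: function (result) {
            $("#modal").html(result);
        }
    });
    // Returning false prevents the normal (page-reloading) form submission,
    // which is what was aborting the XHR mid-request
    return false;
});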
It may also be that long processing on the server side caused the connection to be broken by a timeout mechanism.
Hope this is helpful.

Force reload cached image with same url after dynamic DOM change

I'm developing an Angular 2 application (a single-page application). My page is never "reloaded", but its content changes according to user interactions.
I'm having some cache problems especially with images.
Context:
My page contains an editable image list:
<ul>
    <li><img src="myImageController/1">Edit</li>
    <li><img src="myImageController/2">Edit</li>
    <li><img src="myImageController/3">Edit</li>
</ul>
When I want to edit an image (Edit link), my DOM content is completely changed to show another Angular component with a file-upload component.
The myImageController returns the Last-Modified header, plus Cache-Control: no-cache and must-revalidate.
After a refresh (hitting F5), my page makes a request for every img src, which is correct: if an image has been modified, it is downloaded; if not, I just get a 304, which is fine.
Note: my images are stored in the database as blob fields.
Problem:
When my page content is dynamically reloaded by my single-page app and contains img tags, the browser does not issue a GET HTTP request but immediately takes the image from cache. I assume this is a browser optimization to avoid fetching the same resource multiple times on the same page.
Wrong solutions:
The first solution is to add something like ?time=(new Date()).getTime() to generate unique URLs and avoid the browser cache. This won't send the If-Modified-Since header in the request, and the image is downloaded in full every time.
Do a "real" refresh: the first page load in Angular apps is quite slow, and I don't want to reload everything.
Tests
To simplify the problem, I tried creating a static HTML page containing 3 images with the exact same link to my controller: /myImageController/1. With the Chrome developer tools, I can see that only one GET request is issued. If I manage to get multiple server calls in this case, it would probably solve my problem.
Thank you for your help.
The fifth version of the HTML specification describes this behavior: the browser may reuse images regardless of cache-related HTTP headers. Check this answer for more information. You probably need to use XMLHttpRequest and blobs; in that case you also need to consider the Same-origin policy.
You can use the following function to make sure the user agent performs every request:
var downloadImage = function (imgNode, url) {
    var xhr = new XMLHttpRequest();
    xhr.open("GET", url, true);
    xhr.responseType = "blob";
    xhr.onreadystatechange = function () {
        if (xhr.readyState == 4) {
            if (xhr.status == 200 || xhr.status == 304) {
                var blobUrl = URL.createObjectURL(xhr.response);
                imgNode.src = blobUrl;
                // You can also use the imgNode.onload callback to release blob resources.
                setTimeout(function () {
                    URL.revokeObjectURL(blobUrl);
                }, 1000);
            }
        }
    };
    xhr.send();
};
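A usage sketch against the image list from the question (the selector is hypothetical; any reference to the <img> node works):

// Re-request the first image through XHR instead of letting the
// browser silently reuse its in-memory copy
var imgNode = document.querySelector('ul li img');
downloadImage(imgNode, '/myImageController/1');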
For more information, check the New Tricks in XMLHttpRequest2 article by Eric Bidelman, the Working with files in JavaScript, Part 4: Object URLs article by Nicholas C. Zakas, and the URL.createObjectURL() and Same-origin policy MDN pages.
You can use the random ID trick. This changes the URL so that the browser reloads the image. Note that this can be done in the query parameters to force a full cache break, or in the hash to allow the browser to re-validate the image against the cache (and avoid re-downloading it if unchanged).
function reloadWithCache(img: HTMLImageElement, url: string) {
    // Changing only the hash lets the browser re-validate against its cache
    img.src = url.replace(/#.*/, "") + "#" + Math.random();
}

function reloadBypassCache(img: HTMLImageElement, url: string) {
    // Appending a unique query parameter forces a full re-download
    // (note: the separator must be derived from the url, not the img element)
    let sep = url.indexOf("?") == -1 ? "?" : "&";
    img.src = url + sep + "nocache=" + Math.random();
}
Note that if you are using reloadBypassCache regularly, you are better off fixing your cache headers. This function will always hit your origin server, leading to higher running costs and making CDNs ineffective.

Google Apps Script: Salesforce API Call

Just finished breakfast and already hit a snag. I'm trying to call the Salesforce REST API from my Google Sheet. I've written a working script locally in Python, but converting it into JS, something went wrong:
function authenticateSF() {
    var url = 'https://login.salesforce.com/services/oauth2/token';
    var options = {
        grant_type: 'password',
        client_id: 'XXXXXXXXXXX',
        client_secret: '111111111111',
        username: 'ITSME#smee.com',
        password: 'smee'
    };
    var results = UrlFetchApp.fetch(url, options);
}
Here is the error response:
Request failed for https://login.salesforce.com/services/oauth2/token
returned code 400. Truncated server response:
{"error_description":"grant type not
supported","error":"unsupported_grant_type"} (use muteHttpExceptions
option to examine full response) (line 12, file "Code")
Mind you, these exact parameters work fine in my local Python script (putting the key values inside quotation marks).
Here are the relevant docs:
Google Script: Connecting to external API's
Salesforce: REST API guide
Thank you all!
Google's UrlFetchApp object defaults to a GET request. To authenticate, you have to explicitly set the method to "post" in the options and pass the parameters as the payload:
function authenticateSF() {
    var url = 'https://login.salesforce.com/services/oauth2/token';
    var payload = {
        'grant_type': 'password',
        'client_id': 'XXXXXXXXXXX',
        'client_secret': '111111111111',
        'username': 'ITSME#smee.com',
        'password': 'smee'
    };
    var options = {
        'method': 'post',
        'payload': payload
    };
    var results = UrlFetchApp.fetch(url, options);
}
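To actually use the token afterwards, you would typically parse the JSON body of the response. A sketch continuing from results above (field names follow the standard Salesforce OAuth2 token response):

// The token endpoint returns JSON; getContentText() gives the raw body
var auth = JSON.parse(results.getContentText());
var accessToken = auth.access_token;  // goes into the Authorization: Bearer header
var instanceUrl = auth.instance_url;  // base URL for subsequent REST calls
Logger.log(instanceUrl);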

Why doesn't dojo.io.script.get() execute the provided error function when receiving a 404?

I am trying to use the following to do a cross-domain get:
dojo.io.script.get({
    url: myUrl,
    callbackParamName: "callback",
    preventCache: true,
    load: dojo.hitch(this, loadFunction),
    error: dojo.hitch(this, function () {
        console.log('Error!!!');
    })
});
The load function runs fine, however, when the server returns a 404, the error function does not run. Can anyone tell me why?
EDIT
After some investigation, I found that a timeout and handler could be implemented in the following way:
dojo.io.script.get({
    url: myUrl,
    callbackParamName: "callback",
    timeout: 2000
}).then(function (data) {
    console.log(data);
}, function (error) {
    alert(error);
});
This uses functionality provided by the dojo.Deferred object. Since the script tag never reports the 404, the failure surfaces as a timeout after 2000 ms and triggers the error callback.
When accessing a server with script tags (which is what dojo.io.script.get does), the status code and headers are not available.
You may try some other ways to detect a problem, like using a timeout and analyzing the content of the script. The latter is problematic for JSONP calls (like in your example).
I realize this is old, but I thought I'd share a solution in case others come across this thread like I did.
dojo.io.script essentially adds a <script/> tag to your HTML page. So you can try this:
var script = document.createElement('script');
script.setAttribute('type', 'text/javascript');
script.setAttribute('src', myUrl);
script.onerror = function () {
    // Fires when the script fails to load (e.g. a 404)
    debugger;
};
script.onload = function () {
    // Fires when the script loads successfully
    debugger;
};
document.getElementsByTagName('body')[0].appendChild(script);
That way, if the script fails to load, the onerror event is called.
*This may not work in every instance but is a good start