QWebEngine header issue

I am using QWebEnginePage to download web pages. It works very well, but with certain URLs it fails, and it appears to be a header issue. I can't figure out which headers I need to send to avoid the error below.
Header:
#ifndef MAINWINDOW_H
#define MAINWINDOW_H
#include <QMainWindow>
#include <QApplication>
#include <QWebEnginePage>
namespace Ui {
class MainWindow;
}
class MainWindow : public QMainWindow
{
Q_OBJECT
public:
explicit MainWindow(QWidget *parent = nullptr);
~MainWindow();
private:
QWebEnginePage *p;
Ui::MainWindow *ui;
protected slots:
void getHtml(bool s);
void textChanged();
signals:
};
#endif // MAINWINDOW_H
Source
#include "mainwindow.h"
#include "ui_mainwindow.h"
#include <QApplication>
#include <QWebEnginePage>
#include <QWebEngineSettings>
#include <QWebEngineHttpRequest>
#include <QDebug>
QWebEngineHttpRequest httpR;
MainWindow::MainWindow(QWidget *parent) :
QMainWindow(parent),
ui(new Ui::MainWindow){
ui->setupUi(this);
p = new QWebEnginePage(this);
httpR.setHeader("Location", "absoluteURI");
httpR.setUrl(QUrl("https://ca.finance.yahoo.com/quote/AIPT/history?period1=1238597365&period2=1554130165&interval=1d&filter=history&frequency=1d"));
p->settings()->setAttribute(QWebEngineSettings::AutoLoadImages, false);
p->settings()->setAttribute(QWebEngineSettings::JavascriptEnabled, true);
connect(p, SIGNAL(loadFinished(bool)), this, SLOT(getHtml(bool)));
p->load(httpR);
}
QString html = "";
void MainWindow::getHtml(bool s){
p->toHtml(
[this](QString result) {
html=result;
this->textChanged();
qDebug()<<httpR.headers();
});
}
void MainWindow::textChanged(){
qDebug()<<html.size();
}
MainWindow::~MainWindow(){
delete ui;
}
Error message:
js: Unrecognized Content-Security-Policy directive 'disown-opener'.
js: Unrecognized Content-Security-Policy directive 'disown-opener'.
860801
QVector("Location")
js: Unrecognized Content-Security-Policy directive 'disown-opener'.
js: Unrecognized Content-Security-Policy directive 'disown-opener'.
js: Unrecognized Content-Security-Policy directive 'disown-opener'.
js: The resource https://pagead2.googlesyndication.com/pagead/js/r20190327/r20190131/show_ads_impl.js was preloaded using link preload but not used within a few seconds from the window's load event. Please make sure it has an appropriate as value and it is preloaded intentionally.

So I finally figured it out. It appears that to fix the "Unrecognized Content-Security-Policy directive 'disown-opener'" error you would have to add the rel="noopener" attribute to each link tag. However, since I want to download HTML pages from various sources, it would be impractical to change every link.
Instead, after some trial and error, I found that sending the header "Upgrade: websocket" fixed the issue.
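For reference, here is a minimal sketch of how that looks in the setup above, assuming the same QWebEnginePage member p as in the constructor (QWebEngineHttpRequest requires Qt 5.9 or later):
// In the MainWindow constructor: send "Upgrade: websocket" instead of the earlier "Location" header.
QWebEngineHttpRequest httpR;
httpR.setUrl(QUrl("https://ca.finance.yahoo.com/quote/AIPT/history?period1=1238597365&period2=1554130165&interval=1d&filter=history&frequency=1d"));
httpR.setHeader("Upgrade", "websocket"); // the header that fixed the issue for me
p->load(httpR);                          // p is the QWebEnginePage member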

Related

boost asio SSL stream.shutdown(ec) always fails with boost::asio::ssl::error::stream_truncated

I tried to use Boost Asio and OpenSSL to post some JSON to an SSL server. The problem is that shutting down the stream always fails with boost::asio::ssl::error::stream_truncated. For now I tried to ignore the error, but I have no idea whether I should ignore it or what I have done wrong.
Boost version 1.68.0, OpenSSL 1.1.1, VS 2017 CE, Windows 7 x64.
Here is my code:
#include "root_certificates.hpp"
#include <boost/beast/core.hpp>
#include <boost/beast/http.hpp>
#include <boost/beast/version.hpp>
#include <boost/asio/connect.hpp>
#include <boost/asio/ip/tcp.hpp>
#include <boost/asio/ssl/error.hpp>
#include <boost/asio/ssl/stream.hpp>
#include <cstdlib>
#include <iostream>
#include <string>
#include <time.h>
#include <fstream>
#include <ctime>
#include <istream>
#include <clocale>
namespace http = boost::beast::http;  // from <boost/beast/http.hpp>
namespace ssl = boost::asio::ssl;     // from <boost/asio/ssl/stream.hpp>
using tcp = boost::asio::ip::tcp;     // from <boost/asio/ip/tcp.hpp>
int postsslserver()
{
try
{
auto const host ="mydomain.com";
auto const port = "https";
auto const target ="/apps/postpage.html" ;
int retcode = 0;
std::setlocale(LC_ALL, "");
std::string pwmd5hashed = "mysecret";
std::string jsondata = "\"Double\":12.0000001,";
int version = 11;
// The io_context is required for all I/O
boost::asio::io_context ioc;
// The SSL context is required, and holds certificates
ssl::context ctx{ ssl::context::sslv23_client };
//20181021
ctx.set_default_verify_paths();
// This holds the root certificate used for verification
//load_root_certificates(ctx);
// Verify the remote server's certificate
//ctx.set_verify_mode(ssl::verify_peer);
ctx.set_verify_mode(ssl::verify_none);
// These objects perform our I/O
tcp::resolver resolver{ ioc };
ssl::stream<tcp::socket> stream{ ioc, ctx };
// Set SNI Hostname (many hosts need this to handshake successfully)
if (!SSL_set_tlsext_host_name(stream.native_handle(), host))
{
boost::system::error_code ec{ static_cast<int>(::ERR_get_error()), boost::asio::error::get_ssl_category() };
throw boost::system::system_error{ ec };
}
// Look up the domain name
auto const results = resolver.resolve(host, port);
// Make the connection on the IP address we get from a lookup
boost::asio::connect(stream.next_layer(), results.begin(), results.end());
// Perform the SSL handshake
stream.handshake(ssl::stream_base::client); // the error "handshake: certificate verify failed" always occurred at this line
// Set up an HTTP POST request message
http::request<http::string_body> req{ http::verb::post, target, version };
req.set(http::field::host, host);
req.set(http::field::user_agent, BOOST_BEAST_VERSION_STRING);
req.set(http::field::content_type, "application/json");
req.body() = jsondata;
req.prepare_payload(); // sets Content-Length for the JSON body
// Send the HTTP request to the remote host
http::write(stream, req);
// This buffer is used for reading and must be persisted
boost::beast::flat_buffer buffer;
// Declare a container to hold the response
http::response<http::dynamic_body> res;
// Receive the HTTP response
http::read(stream, buffer, res);
// Write the message to standard out
std::cout << res << std::endl;
// Gracefully close the stream
boost::system::error_code ec;
stream.shutdown(ec); // the problem is here! it always fails with boost::asio::ssl::error::stream_truncated
if (ec == boost::asio::error::eof)
{
// Rationale:
ec.assign(0, ec.category());
}
if (ec != boost::asio::ssl::error::stream_truncated) // then I tried to ignore it
{
std::cout << ec.message() << std::endl;
throw boost::system::system_error{ ec };
}
// If we get here then the connection is closed gracefully
}
catch (std::exception const& e)
{
//std::cerr << "Error: " << e.what() << std::endl;
write_text_to_log_file(e.what());
return EXIT_FAILURE;
}
return EXIT_SUCCESS;
}
Thank you very much.
The correct way to securely shut down an SSL socket is not to call shutdown on it directly.
First cancel any outstanding operations, then initiate an asynchronous shutdown, and close the socket afterwards.
Here's a snippet of a working solution:
virtual void OnClose(boost::system::error_code &ec)
{
//...
_socket.lowest_layer().cancel(ec);
_socket.async_shutdown(std::bind(&Socket<T>::ShutdownHandler, this->shared_from_this(), std::placeholders::_1));
//...
}
void ShutdownHandler(const boost::system::error_code &ec)
{
//On async_shutdown both parties send and receive a 'close_notify' message.
//When the shutdown has been negotiated by both parties, the underlying
//transport may either be reused or closed.
//The initiator of the shutdown will enter the shutdown-handler with an
//error value of eof. (Socket was securely shutdown)
if (ec && ec != boost::asio::error::eof)
{
LogError(ec, Error::ErrorLocation(__FUNCTION__, __LINE__));
}
boost::system::error_code ec2;
_socket.lowest_layer().close(ec2);
if (ec2)
LogError(ec2, Error::ErrorLocation(__FUNCTION__, __LINE__));
_shutdownComplete.exchange(true);
}
Also: boost asio ssl async_shutdown always finishes with an error?
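Applied to the synchronous code in the question, a rough equivalent is sketched below. This is only an illustration of the same idea, assuming the server may simply drop the TCP connection without sending a close_notify; in that case eof and stream_truncated can both be treated as a benign end of the session before the socket is closed.
// Sketch: tolerant synchronous shutdown of an ssl::stream<tcp::socket>.
boost::system::error_code ec;
stream.shutdown(ec);
if (ec == boost::asio::error::eof ||
    ec == boost::asio::ssl::error::stream_truncated)
{
    // The peer closed the connection without completing the TLS shutdown;
    // treat this as a graceful close.
    ec.assign(0, ec.category());
}
// Close the underlying TCP socket regardless of the shutdown result.
boost::system::error_code ignored;
stream.lowest_layer().close(ignored);
if (ec)
{
    throw boost::system::system_error{ ec };
}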

EOFException: null when applying gzip to Dropwizard along with a custom Filter

We are facing an exception with dropwizard-core:1.1.2 while trying to add gzip content-encoding to the service response headers. The details are as follows:
GzipFilter.class
import java.io.IOException;
import javax.servlet.*;
import javax.servlet.http.HttpServletResponse;
public class GzipFilter implements Filter {
public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
throws IOException, ServletException {
HttpServletResponse response = (HttpServletResponse) res;
response.setHeader("Content-Encoding", "gzip");
chain.doFilter(req, res);
}
public void init(FilterConfig filterConfig) {
}
public void destroy() {
}
}
Service.class
@Override
public void run(DocumentServiceConfig config, Environment environment) throws Exception {
Injector injector = createInjector(config, environment);
environment.jersey().register(injector.getInstance(SomeResource.class));
environment.servlets().addFilter("Gzip-Filter", GzipFilter.class).addMappingForUrlPatterns(EnumSet.allOf(DispatcherType.class), true, "/*");
}
config.yml
gzip:
  enabled: true
  minimumEntitySize: 256B
  bufferSize: 32KB
Exception stack trace for 500 API response -
WARN [2017-08-04 00:48:20,713] org.eclipse.jetty.server.HttpChannel: /clients/v2
! java.io.EOFException: null
! at java.util.zip.GZIPInputStream.readUByte(GZIPInputStream.java:268)
! at java.util.zip.GZIPInputStream.readUShort(GZIPInputStream.java:258)
! at java.util.zip.GZIPInputStream.readHeader(GZIPInputStream.java:164)
! at java.util.zip.GZIPInputStream.<init>(GZIPInputStream.java:79)
! at io.dropwizard.jetty.BiDiGzipHandler.wrapGzippedRequest(BiDiGzipHandler.java:100)
! at io.dropwizard.jetty.BiDiGzipHandler.handle(BiDiGzipHandler.java:64)
! at org.eclipse.jetty.server.handler.RequestLogHandler.handle(RequestLogHandler.java:56)
! at org.eclipse.jetty.server.handler.StatisticsHandler.handle(StatisticsHandler.java:169)
! at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132)
! at org.eclipse.jetty.server.Server.handle(Server.java:564)
! at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:320)
! at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:251)
! at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:279)
! at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:110)
! at org.eclipse.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:124)
! at org.eclipse.jetty.util.thread.Invocable.invokePreferred(Invocable.java:122)
! at org.eclipse.jetty.util.thread.strategy.ExecutingExecutionStrategy.invoke(ExecutingExecutionStrategy.java:58)
! at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.produceConsume(ExecuteProduceConsume.java:201)
! at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.run(ExecuteProduceConsume.java:133)
! at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:672)
! at org.eclipse.jetty.util.thread.QueuedThreadPool$2.run(QueuedThreadPool.java:590)
! at java.lang.Thread.run(Thread.java:745)
I was unsure whether I should answer this myself, but since the details below mostly resolve the issue, I am answering it myself.
I went ahead and described the same problem at Dropwizard issue #2126.
Quoting @arteam here, who provided the solution for the current implementation:
I believe Dropwizard does gzip compression automatically. The support
for gzip is enabled by default (see
http://www.dropwizard.io/1.1.2/docs/manual/configuration.html#gzip).
So, if the client supports decompression by sending a request with the
Accept-Encoding:gzip header,
org.eclipse.jetty.server.handler.gzip.GzipHandler will compress the
response and add the Content-Encoding: gzip header.
One question still remains, though, which is why I am not marking this as the accepted answer:
Why your custom filter doesn't work is not clear, maybe your filter is executed before the Jersey servlet and it rewrites the header.
So all that was needed was to apply the following service.yml changes:
gzip:
  enabled: true
  minimumEntitySize: 256B
  bufferSize: 32KB
and not to implement any custom filter, which ends up overriding the built-in implementation and, beyond that, causes the exception in the title.
Another point to note: this should be tested with response sizes both above and below the minimumEntitySize specified in the configuration.

Where to keep files to make them accessible in Apache Tomcat?

I have a file named index.jsp that contains a form; the form action points to a servlet named getAttr.java. Inside the getAttr.java servlet I am writing code that will download a .jar file located in the same directory as WEB-INF.
getJar.java code:
protected void doGet(HttpServletRequest request, HttpServletResponse response) throws ServletException, IOException {
response.setContentType("application/jar");
ServletContext context=getServletContext();
InputStream inputStream=context.getResourceAsStream("eee.jar");
int read=0;
byte bytes[]=new byte[1024];
OutputStream outputStream=response.getOutputStream();
while((read=inputStream.read(bytes))!=-1){
outputStream.write(bytes,0,read);
}
outputStream.flush();
RequestDispatcher dispatcher=request.getRequestDispatcher("getJar.jsp");
dispatcher.forward(request, response);
outputStream.close();
}
protected void doPost(HttpServletRequest request, HttpServletResponse response) throws ServletException, IOException {
// TODO Auto-generated method stub
doGet(request, response);
}
getJar.jsp code:
<!-- necessary html code here -->
<body>
<%
// contains no code.
//I don't know what to write here
%>
</body>
<!-- necessary html code here -->
I am coding J2EE in Eclipse Neon. My directory structure: (screenshot not included)
Now my problem is:
User clicks a button in index.jsp which sends him to getAttr.java servlet.
The getAttr servlet shows an error (screenshot not included). Perhaps eee.jar is not accessible. Where should I keep this .jar file?
Ask me if more information is needed.

Glassfish 4, JSF 2.2 and PrimeFaces FileUploadEvent not working together

After upgrading to GlassFish 4 and JSF 2.2, the PrimeFaces FileUploadEvent stopped working. With JSF 2.1 it worked with no problem. Everything else works fine except file uploading. Is there something I am missing?
GlassFish 4
JSF 2.2
PrimeFaces 3.4.2 and 3.5
Commons io version: 2.4
Commons fileupload version: 1.3
Controller side
public void handleFileUpload(FileUploadEvent event) {
System.out.println("HandleFileUpload");
byte[] file = event.getFile().getContents();
newFieldset.setData(file);
FacesMessage msg = new FacesMessage("Succesful", event.getFile().getFileName() + " is uploaded.");
FacesContext.getCurrentInstance().addMessage(null, msg);
}
View
<h:form enctype="multipart/form-data">
<p:fieldset legend="Create new feed" toggleable="true" collapsed="true" >
<p:fileUpload fileUploadListener="#{adminHomeController.handleFileUpload}" style="margin-top: 20px;"
mode="advanced"
update="messages"
sizeLimit="1000000"
multiple="false"
allowTypes="/(\.|\/)(gif|jpe?g|png)$/"/>
<p:inputText label="Baslik" style="margin-top: 20px;" required="true" value="#{adminHomeController.newFieldset.legend}" />
<p:editor style="margin-top: 20px;"
value="#{adminHomeController.newFieldset.content}" />
<p:commandButton style="margin-top: 20px;" value="#{msg['common.save']}" update="messages" icon="ui-icon-disk" actionListener="#{adminHomeController.saveFieldset()}"/>
</p:fieldset>
<p:growl id="messages" showDetail="true"/>
</h:form>
I was finally able to figure it out. Commons FileUpload's parseRequest(httpServletRequest) method tries to read the request's input stream; since the container has already read it, the stream is empty. So what can be done to solve this? The answer is a bit more complicated than I initially thought. First, you will need your own FileUploadFilter, which could look like this:
public class FileUploadFilter implements Filter
{
private final static Logger LOGGER = LoggerFactory.getLogger(FileUploadFilter.class);
/*
* (non-Javadoc)
*
* @see javax.servlet.Filter#init(javax.servlet.FilterConfig)
*/
@Override
public void init(FilterConfig filterConfig) throws ServletException
{
}
/*
* (non-Javadoc)
*
* @see javax.servlet.Filter#doFilter(javax.servlet.ServletRequest,
* javax.servlet.ServletResponse, javax.servlet.FilterChain)
*/
@Override
public void doFilter(ServletRequest request, ServletResponse response, FilterChain filterChain) throws IOException, ServletException
{
HttpServletRequest httpServletRequest = (HttpServletRequest) request;
boolean isMultipart = (httpServletRequest.getContentType() == null) ? false : httpServletRequest.getContentType().toLowerCase().startsWith("multipart/");
if (isMultipart)
{
MultipartRequest multipartRequest = new MultipartRequest(httpServletRequest);
LOGGER.info("File upload request parsed succesfully, continuing with filter chain with a wrapped multipart request");
filterChain.doFilter(multipartRequest, response);
}
else
{
filterChain.doFilter(request, response);
}
}
/*
* (non-Javadoc)
*
* @see javax.servlet.Filter#destroy()
*/
@Override
public void destroy()
{
LOGGER.info("Destroying UploadFilter");
}
}
Next: Register this filter in your web.xml and remove/replace the Primefaces filter. This should look something like this:
<filter>
<filter-name>FileUpload Filter</filter-name>
<filter-class><YourPackage>.FileUploadFilter</filter-class>
</filter>
<filter-mapping>
<filter-name>FileUpload Filter</filter-name>
<servlet-name>Faces Servlet</servlet-name>
</filter-mapping>
Unfortunately, that's not it. You will need your own MultipartRequest, since you have to assemble the list of FileItems yourself. But stop: we have to work with the javax.servlet.Part classes, which are not compatible with FileItem. So I wrote a new class which bridges the two. You can find this class here: http://pastebin.com/JcfAYjey
The last piece of the puzzle is the mentioned MultipartRequest, which links the PartItem and the FileUploadFilter. I took this class from the PrimeFaces repository and changed it according to our needs (see http://pastebin.com/Vc5h2rmJ). The difference is between lines 47 and 57.
So what do you have to do:
1. Create the three classes FileUploadFilter, MultipartRequest and PartItem
2. Register the FileUploadFilter in your web.xml
3. Enjoy!
PLEASE NOTE: This is not intended as a solve-all-problems solution but merely a direction you may take in further implementations. The MultipartRequest, for example, will only work for parts with content type image/*. You may need to change this.
Feel free to change the code ;) Hope it helps!
EDIT: I forgot to mention one important step. You will additionally need your own FileUploadRenderer. The one PrimeFaces implemented uses an instanceof check to find the MultipartRequest. Since you are now using a different one, the import has to be changed. The rest of the class can stay the same (http://pastebin.com/rDUkPqf6). Don't forget to register it inside your faces-config.xml:
<render-kit>
<renderer>
<component-family>org.primefaces.component</component-family>
<renderer-type>org.primefaces.component.FileUploadRenderer</renderer-type>
<renderer-class><YourPackage>.FileUploadRenderer</renderer-class>
</renderer>
</render-kit>
The answer lies in UploadedFile's getInputstream() method. Don't rely on the getContents() method.
This is my simple solution, which worked with the dependencies below on GlassFish 4:
Primefaces 4.0.RC1
jsf 2.2
commons-fileupload 1.3
private byte[] getFileContents(InputStream in) {
byte[] bytes = null;
try {
// write the inputStream to a FileOutputStream
ByteArrayOutputStream bos = new ByteArrayOutputStream();
int read = 0;
bytes = new byte[1024];
while ((read = in.read(bytes)) != -1) {
bos.write(bytes, 0, read);
}
bytes = bos.toByteArray();
in.close();
in = null;
bos.flush();
bos.close();
bos = null;
logger.debug("New file created!");
} catch (IOException e) {
System.out.println(e.getMessage());
}
return bytes;
}
getFileContents(getFile().getInputstream());
Try to delete beans.xml (CDI configuration file) and use JSF beans.
I saw on PrimeFaces blog that full JSF 2.2 support will be as of version 4.0.
See 3.5 is missing dependency - so won't launch
I think it's a commons-fileupload issue. When I debug through the code, PrimeFaces' UploadFilter correctly triggers commons-fileupload's FileUploadBase.parseRequest method (the flow is identical whether I use GlassFish 3.1.22 or GlassFish 4), but the check on FileItemIterator.hasNext returns false.

Link error in a C++/CLI application linking a static C++/CLI library

I have a static C++/CLI library in which the following class is defined:
ObjectWrapper.h:
public ref class CObjectWrapper: System::Object
{
public:
CObjectWrapper(CObject& wrappedObject);
explicit operator CObject*();
private:
CObject& m_WrappedObject;
};
ObjectWrapper.cpp:
#include "stdafx.h"
#include "BasicObjectWrapper.h"
CObjectWrapper::CObjectWrapper(CObject& wrappedObject)
: WrappedObject(wrappedObject)
{ }
CObjectWrapper::operator CObject*()
{
return &WrappedObject;
}
I have a C++/CLI application which links against the static library. The following errors arise at link time:
Error 3 error LNK2020: unresolved token (06000007) CObjectWrapper::.ctor KCBrowserInEcrinView.obj
Error 4 error LNK2020: unresolved token (06000005) CObjectWrapper::.ctor KCBrowserLibD9.lib
Error 5 error LNK2020: unresolved token (06000008) CObjectWrapper::op_Implicit KCBrowserInEcrinView.obj
Error 6 error LNK2020: unresolved token (06000006) CObjectWrapper::op_Implicit KCBrowserLibD9.lib
I solved the problem by moving the implementation (previously located in the .cpp file) into the .h file. I don't understand why this works.
I would highly appreciate it if anybody could provide an explanation.
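For reference, the workaround looks roughly like this: the whole class lives in the header, so the method bodies are compiled inline in every translation unit that uses the wrapper (this is only a sketch of the header-only version; names match the code above):
// ObjectWrapper.h -- workaround: definitions kept inline in the header
#ifndef OBJECTWRAPPER_H
#define OBJECTWRAPPER_H
public ref class CObjectWrapper : System::Object
{
public:
    CObjectWrapper(CObject& wrappedObject)
        : m_WrappedObject(wrappedObject)
    {
    }
    explicit operator CObject*()
    {
        return &m_WrappedObject;
    }
private:
    CObject& m_WrappedObject; // reference to the wrapped native object
};
#endif // OBJECTWRAPPER_H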