Issue with Spring Security, Spring Webflow, file uploads and UTF-8

I have a problem very similar to the one described here: File Upload using Spring WebFlow 2.4.0, parameter not binded, but that one didn't mention anything about UTF-8 issues. I'm using Spring Framework 4.1.6, Spring Security 4.0.2 and Spring Webflow 2.4.2.
It revolves around StandardServletMultipartResolver vs. CommonsMultipartResolver as far as I can tell, but I'm not sure. If I use CommonsMultipartResolver I can upload files fine on any page except Webflow pages, and UTF-8 encoding works on all pages. However, on the Webflow pages an exception is thrown when trying to access the file. If I use StandardServletMultipartResolver then all of the file uploads work, including Webflow, but on any page that has a UTF-8 character, e.g., caractère, I get garbage.
The weird thing is I can see in Firebug that the file is being posted when I use the commons resolver. Also, if I debug the RequestContext coming from Webflow, I can see the file buried four levels deep in nested requests. The code for the commons resolver (see end of post for the standard resolver code):
public FileResult uploadFile(Recipe recipe, RequestContext requestContext) {
    ServletExternalContext context = (ServletExternalContext) requestContext.getExternalContext();
    MultipartHttpServletRequest multipartRequest = new DefaultMultipartHttpServletRequest((HttpServletRequest) context.getNativeRequest());
    MultipartFile file = multipartRequest.getFile("file");
So, is this a Spring Security issue or a Spring Webflow problem? I suspect the commons resolver would work if I could cast the RequestContext above correctly, but I've tried numerous combinations with no luck. Any guidance on this would be greatly appreciated.
Here are some relevant configurations and code:
WebMvcConfig
@Bean
public CommonsMultipartResolver filterMultipartResolver() {
    CommonsMultipartResolver resolver = new CommonsMultipartResolver();
    resolver.setDefaultEncoding("UTF-8");
    return resolver;
}
SecurityConfig
@Override
protected void configure(HttpSecurity http) throws Exception {
    CharacterEncodingFilter characterEncodingFilter = new CharacterEncodingFilter();
    characterEncodingFilter.setEncoding("UTF-8");
    characterEncodingFilter.setForceEncoding(true);
    http
        //.csrf().disable()
        .addFilterBefore(characterEncodingFilter, CsrfFilter.class)
        ...more settings...
SecurityInitializer
@Override
protected void beforeSpringSecurityFilterChain(ServletContext servletContext) {
    insertFilters(servletContext, new MultipartFilter());
}
Webflow Action
<action-state id="uploadFile">
    <evaluate expression="fileActions.uploadFile(recipe, flowRequestContext)"/>
    <transition to="review"/>
</action-state>
Upload file method
public FileResult uploadFile(Recipe recipe, RequestContext requestContext) {
    ServletExternalContext context = (ServletExternalContext) requestContext.getExternalContext();
    MultipartHttpServletRequest multipartRequest = new StandardMultipartHttpServletRequest((HttpServletRequest) context.getNativeRequest());
    MultipartFile file = multipartRequest.getFile("file");
    ...rest of code to save the file...
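For reference, the StandardServletMultipartResolver setup being compared against looks roughly like this (a sketch; the temp directory and size limits are placeholders, not my real values):

@Bean
public StandardServletMultipartResolver multipartResolver() {
    // Delegates to the container's Servlet 3.0 multipart parsing; note it has
    // no defaultEncoding property, which seems to be where the UTF-8 trouble
    // creeps in
    return new StandardServletMultipartResolver();
}

// ...and in the dispatcher servlet initializer, multipart parsing must be
// enabled on the servlet registration:
@Override
protected void customizeRegistration(ServletRegistration.Dynamic registration) {
    // args: location, maxFileSize, maxRequestSize, fileSizeThreshold
    registration.setMultipartConfig(new MultipartConfigElement("/tmp", 5000000, 25000000, 0));
}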

Turns out you can cast the RequestContext to get at the underlying MultipartHttpServletRequest but it's not pretty. Here's what I ended up with:
Upload file method
public FileResult uploadFile(Recipe recipe, RequestContext requestContext) {
    logger.debug("uploadFile");
    ServletExternalContext context = (ServletExternalContext) requestContext.getExternalContext();
    // Peel back the wrapper chain by hand: security wrapper -> request wrapper -> firewall -> multipart
    SecurityContextHolderAwareRequestWrapper wrapper1 = (SecurityContextHolderAwareRequestWrapper) context.getNativeRequest();
    HttpServletRequestWrapper wrapper2 = (HttpServletRequestWrapper) wrapper1.getRequest();
    FirewalledRequest firewall = (FirewalledRequest) wrapper2.getRequest();
    MultipartHttpServletRequest multipartRequest = (DefaultMultipartHttpServletRequest) firewall.getRequest();
    MultipartFile file = multipartRequest.getFile("file");
    ...rest of code to save the file...
Using this I get to keep the CommonsMultipartResolver, all file uploads in the app work whether Webflow or not, and I have no issues with UTF-8 and character mangling.
I'm not particularly happy with this solution (even though it works) since it depends on a specific nesting of request wrappers that could change in the future. I'd be interested to hear if anyone else has run into the same UTF-8 issue and how they solved it, but for now I'm going to test the heck out of this and move on.
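A less brittle alternative would be to let Spring walk the wrapper chain itself via WebUtils.getNativeRequest, which unwraps HttpServletRequestWrapper instances until it finds the requested type (a sketch, not tested against this exact setup):

HttpServletRequest nativeRequest = (HttpServletRequest) context.getNativeRequest();
// org.springframework.web.util.WebUtils unwraps the wrapper chain for us;
// returns null if no MultipartHttpServletRequest is anywhere in the chain
MultipartHttpServletRequest multipartRequest =
        WebUtils.getNativeRequest(nativeRequest, MultipartHttpServletRequest.class);
if (multipartRequest != null) {
    MultipartFile file = multipartRequest.getFile("file");
    // ...rest of code to save the file...
}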

Related

"URL was not normalized" error when using IntelliJ but not when using STS

The website works fine on the remote server and on my local machine (when using the STS IDE). Recently I started using IntelliJ IDEA (I created a duplicate of the website code with no changes at all), and I started getting the "URL was not normalized" error.
Does IntelliJ handle Spring Security somehow differently than STS, or what else could be the cause?
I don't want to use a custom HttpFirewall.
@EnableGlobalMethodSecurity(prePostEnabled = true)
@Configuration
@EnableWebSecurity
public class SecurityConfiguration extends WebSecurityConfigurerAdapter {

    @Override
    protected void configure(AuthenticationManagerBuilder auth) throws Exception {
        auth.authenticationProvider(authenticationProvider())
            .jdbcAuthentication()
            .usersByUsernameQuery(usersQuery)
            .authoritiesByUsernameQuery(rolesQuery)
            .dataSource(dataSource);
    }
    @Override
    protected void configure(HttpSecurity http) throws Exception {
        // URLs matching for access rights
        http.authorizeRequests()
            .antMatchers("/", "/contact", "/register").permitAll()
            .antMatchers("/accounts").hasAnyAuthority("SUPER_USER", "ADMIN_USER")
            .anyRequest().authenticated()
            .and()
            // form login
            .csrf().disable().formLogin()
            .loginPage("/index")
            .failureUrl("/index?error=true")
            .defaultSuccessUrl("/user")
            .usernameParameter("email")
            .passwordParameter("password")
            .and()
            // logout
            .logout()
            .logoutRequestMatcher(new AntPathRequestMatcher("/logout"))
            .logoutSuccessUrl("/").and()
            .exceptionHandling()
            .accessDeniedPage("/access-denied");
    }
    @Override
    public void configure(WebSecurity web) throws Exception {
        web.ignoring().antMatchers("/resources/**", "/static/**", "/css/**", "/js/**", "/images/**");
    }
}
and this is from the properties file:
# Spring MVC view prefix.
spring.mvc.view.prefix=/templates/
# Spring MVC view suffix.
spring.mvc.view.suffix=.html
the error is:
org.springframework.security.web.firewall.RequestRejectedException: The request was rejected because the URL was not normalized.
P.S.: I'm using JDK 8, Spring Boot 2, Spring Security, Thymeleaf, and IntelliJ IDEA Ultimate 2019.2
Which IDE you use should make no difference when running the same source code on the embedded server configured by Spring Boot. This error happens when the HTTP request sent to the server is not normalized, i.e., the URL contains character sequences like "./", "/../", "//" or "/.". So I suspect you are browsing the app with a different URL, for example accidentally adding an extra '/' such as http://127.0.0.1:8080/app//index.html
You can switch to a less secure HttpFirewall to avoid this check:
@Override
public void configure(WebSecurity web) throws Exception {
    web.httpFirewall(new DefaultHttpFirewall());
    // another configuration...
}
P.S. Though it is called DefaultHttpFirewall, it has not been the default HttpFirewall used by Spring Security since 4.2.4; it is less secure than the actual default, StrictHttpFirewall.
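If replacing the whole firewall feels too heavy-handed, StrictHttpFirewall also exposes individual toggles for some of the rejected sequences. Note there is no toggle for a literal "//" in the path in this version, which is why the fallback above uses DefaultHttpFirewall; a sketch, assuming one of the covered cases applies:

@Override
public void configure(WebSecurity web) throws Exception {
    StrictHttpFirewall firewall = new StrictHttpFirewall();
    firewall.setAllowSemicolon(true);        // allow ;jsessionid=... style paths
    firewall.setAllowUrlEncodedSlash(true);  // allow %2F in the path
    web.httpFirewall(firewall);
}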

ABCpdf - Download PDF with .NET Core 2.1 - HttpContext/HttpResponse

I'm creating a web page that will allow the user to download a report as a PDF using ABCpdf. But reading the documentation, the only options I see are using doc.Save("test.pdf") (which saves the file on the server hosting the application) or using HttpContext.Current.ApplicationInstance.CompleteRequest(); (which sends the file to the client, which is what I want, but HttpContext.Current is not available in .NET Core).
The band-aid solution I have is to use doc.Save() to save the file on the server, then send the view a link that downloads it from the server. A potential risk I can think of is making sure to clean up on the server after the download has commenced.
Is there an alternative/.NET Core equivalent for HttpContext.Current and also HttpResponse?
Here is the code that I'd like to make work:
byte[] theData = doc.GetData();
Response.ClearHeaders();
Response.ClearContent();
Response.Expires = -1000;
Response.ContentType = "application/pdf";
Response.AddHeader("content-length", theData.Length.ToString());
Response.AddHeader("content-disposition", "attachment; filename=test.pdf");
Response.BinaryWrite(theData);
HttpContext.Current.ApplicationInstance.CompleteRequest();
Errors I get (non-verbose)
'HttpResponse' does not contain a definition for 'ClearHeaders'
'HttpResponse' does not contain a definition for 'ClearContent'
'HttpResponse' does not contain a definition for 'Expires'
'HttpResponse' does not contain a definition for 'AddHeader'
'HttpResponse' does not contain a definition for 'BinaryWrite'
'HttpContext' does not contain a definition for 'Current'
I've updated this answer to something that actually works!
GetStream does what you need; however, to facilitate a file download in .NET Core it is far easier to create a controller as described in https://learn.microsoft.com/en-us/aspnet/core/tutorials/first-web-api?view=aspnetcore-2.1.
Then you can create a route controller to serve the file from the stream as shown in Return PDF to the Browser using Asp.net core.
So your controller would look something like:
[Route("api/[controller]")]
public class PDFController : Controller {
// GET: api/<controller>
[HttpGet]
public IActionResult Get() {
using (Doc theDoc = new Doc()) {
theDoc.FontSize = 96;
theDoc.AddText("Hello World");
Response.Headers.Clear();
Response.Headers.Add("content-disposition", "attachment; filename=test.pdf");
return new FileStreamResult(theDoc.GetStream(), "application/pdf");
}
}
}
Out of curiosity I just mocked this up and it does work, serving the PDF directly to the browser as a download when you go to the URL localhost:port/api/pdf. If you make the content-disposition "inline; filename=test.pdf" it will show in the browser and be downloadable as test.pdf.
More information on the GetStream method here: https://www.websupergoo.com/helppdfnet/default.htm?page=source%2F5-abcpdf%2Fdoc%2F1-methods%2Fgetstream.htm

How can WebFlux handle global errors, like 404 page not found

I use @RestControllerAdvice and @ExceptionHandler, and I can handle controller exceptions, but server errors like 404 and 500 can't be handled this way.
@RestControllerAdvice
public class HttpExceptionHandler {
    private static final Logger logger = LoggerFactory.getLogger(HttpExceptionHandler.class);

    @ExceptionHandler(value = Exception.class)
    public String exceptions(Exception e) {
        String code = Global.ERR_UNKNOWN;
        if (e instanceof MethodNotAllowedException) {
            code = Global.ERR_HTTP_METHOD;
        }
        return code;
    }
}
If you're using Spring Boot, this is already done for you and you can customize this support as well quite easily (see Spring Boot reference docs).
If you're using plain Spring Framework, then you need to register a custom WebExceptionHandler bean to handle that (see Spring Framework reference docs). Because those errors can happen at any point during request handling (i.e. not only during the controller handling phase, but also during response encoding, within a WebFilter...), the API there is quite low level and you need to deal with raw DataBuffer instances. If you're looking for inspiration on how to achieve higher level error handling support, you can also take a look at what's done in Spring Boot.
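For illustration, a minimal WebExceptionHandler might look like the sketch below (the JSON body and the ordering are assumptions; Spring Boot's own DefaultErrorWebExceptionHandler runs at order -1, so -2 takes precedence):

import java.nio.charset.StandardCharsets;

import org.springframework.core.annotation.Order;
import org.springframework.core.io.buffer.DataBuffer;
import org.springframework.http.HttpStatus;
import org.springframework.http.MediaType;
import org.springframework.stereotype.Component;
import org.springframework.web.server.ResponseStatusException;
import org.springframework.web.server.ServerWebExchange;
import org.springframework.web.server.WebExceptionHandler;
import reactor.core.publisher.Mono;

@Component
@Order(-2) // run before the default handlers
public class GlobalErrorHandler implements WebExceptionHandler {

    @Override
    public Mono<Void> handle(ServerWebExchange exchange, Throwable ex) {
        // 404, 405, etc. surface as ResponseStatusException in WebFlux
        HttpStatus status = ex instanceof ResponseStatusException
                ? ((ResponseStatusException) ex).getStatus()
                : HttpStatus.INTERNAL_SERVER_ERROR;
        exchange.getResponse().setStatusCode(status);
        exchange.getResponse().getHeaders().setContentType(MediaType.APPLICATION_JSON);
        // The low-level API requires writing raw DataBuffer instances
        byte[] body = ("{\"code\":" + status.value() + "}").getBytes(StandardCharsets.UTF_8);
        DataBuffer buffer = exchange.getResponse().bufferFactory().wrap(body);
        return exchange.getResponse().writeWith(Mono.just(buffer));
    }
}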

Camel aws-s3 not working

I am trying to create a Camel route to transfer a file from an FTP server to AWS S3 storage.
I have written the following route
private static class MyRouteBuilder extends RouteBuilder {
    @Override
    public void configure() throws Exception {
        from("sftp://<<ftp_server_name>>&noop=true&include=<<file_name>>...")
            .process(new Processor() {
                @Override
                public void process(Exchange ex) {
                    System.out.println("Hello");
                }
            })
            .to("aws-s3://my-dev-bucket?accessKey=ABC***********&secretKey=12abc********+**********");
    }
}
The issue is, this gives me the following exception:
Exception in thread "main" org.apache.camel.FailedToCreateRouteException: Failed to create route route1 at: >>> To[aws-s3://my-dev-bucket?accessKey=ABC*******************&secretKey=123abc******************** <<< in route: Route(route1)[[From[sftp://<<ftp-server>>... because of Failed to resolve endpoint: aws-s3://my-dev-bucket?accessKey=ABC***************&secretKey=123abc************** due to: The request signature we calculated does not match the signature you provided. Check your key and signing method.
I then tried to do this the other way, i.e., writing a method like this:
public void boot() throws Exception {
    // create a Main instance
    main = new Main();
    // enable hangup support so you can press ctrl + c to terminate the JVM
    main.enableHangupSupport();
    // bind MyBean into the registry
    main.bind("foo", new MyBean());
    // add routes
    AWSCredentials awsCredentials = new BasicAWSCredentials("ABC*****************", "123abc*************************");
    AmazonS3 client = new AmazonS3Client(awsCredentials);
    //main.bind("client", client);
    main.addRouteBuilder(new MyRouteBuilder());
    main.run();
}
and invoking it using the bound variable #client. This approach does not give any exceptions, but the file transfer does not work.
To make sure there's nothing wrong with my approach, I tried aws-sqs instead of aws-s3 and that works fine (the file successfully transfers to the SQS queue).
Any idea why this is happening? Is there some basic issue with the "aws-s3" connector for Camel?
Have you tried using the RAW() function to wrap the keys, like RAW(secretKey) or RAW(accessKey)?
It will help you pass your keys as-is, without encoding.
Any plus signs in your secret key need to be URL-encoded as %2B; in your case **********+*********** becomes **********%2B***********
When you configure Camel endpoints using URIs, the parameter values get URL-encoded by default.
This can be a problem when you want to configure passwords as-is.
To do that, you can tell Camel to use the raw value by enclosing it with RAW(value). See more details at "How do I configure endpoints", which also has an example.
See Camel Documentation
Your URL should look like:
aws-s3:bucketName?accessKey=RAW(XXXX)&secretKey=RAW(XXXX)
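Put together in a route, that might look like the sketch below (the SFTP host, path, and masked credentials are placeholders, not values from the question):

import org.apache.camel.builder.RouteBuilder;

public class S3RouteBuilder extends RouteBuilder {
    @Override
    public void configure() throws Exception {
        // RAW(...) stops Camel from URL-encoding the enclosed value, so a '+'
        // in the secret key is passed through to the S3 client unchanged
        from("sftp://user@ftp.example.com/inbox?password=RAW(ftpPassword)&noop=true")
            .to("aws-s3://my-dev-bucket?accessKey=RAW(AKIAXXXX)&secretKey=RAW(12abc+XXXX)");
    }
}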

Bad CRC32 in GZIP stream

I am using DevForce 2010 and Silverlight 4.
When saving entities that contain large amounts of binary data, I get this error:
Unhandled Error in Silverlight Application The remote server returned an error: NotFound.
When debugging the application I see these errors:
Unhandled Error in Silverlight Application Insufficient memory to continue the execution of the program.
Bad CRC32 in GZIP stream.
I found this thread on Ideablades forum that discusses the problem: http://www.ideablade.com/forum/forum_posts.asp?TID=3361&PN=1&title=bad-crc32-in-gzip-stream
Is this a problem on the server or client?
Is this a problem that has been resolved in any new version of DevForce 2010?
My server has 4 GB memory. Would increasing the memory resolve the problem?
Or what would be the right solution?
Yes, the OnEndpointCreated overrides on both client and server are where you should add the customization. You can add the following to remove GZIP from the binding:
public override void OnEndpointCreated(System.ServiceModel.Description.ServiceEndpoint endpoint)
{
    if (endpoint.Binding is CustomBinding)
    {
        var binding = endpoint.Binding as CustomBinding;
        var elements = binding.CreateBindingElements();
        // Swap out existing (GZIP) message encoding for binary
        var encoding = elements.Find<MessageEncodingBindingElement>();
        if (encoding != null)
        {
            elements.Remove(encoding);
            encoding = new BinaryMessageEncodingBindingElement();
            elements.Insert(0, encoding);
            endpoint.Binding = new CustomBinding(elements);
        }
    }
}
DevForce will find your classes if they're in an assembly probed on the client/server.
This will turn off compression for everything from your DevForce client to the EntityServer, so it may be a bit heavy-handed. You can turn on IIS compression to compress data sent to the client to help.
There haven't been any changes to GZIP processing since the 6.1.7 release of DevForce 2010. That thread still contains the best information of how to work around the problem: 1) modify the save logic or your entity definition to reduce the amount of data saved; 2) turn off use of GZIP; or 3) write a custom message encoder with another compression library.
Thank you Kim Johnson,
I have looked at the samples and I feel uncomfortable adding those config files and maybe breaking something that works fine today.
If I go the code way, will I be able to switch off GZIP and still retain the rest of the default settings for DevForce?
I guess the code below is what I should go for?
If I save these classes on the client and server, will DevForce automatically find these classes?
//Client
using System.ServiceModel.Channels;
using IdeaBlade.Core.Wcf.Extensions;

public class ProxyEvents : IdeaBlade.EntityModel.ServiceProxyEvents {
    public override void OnEndpointCreated(System.ServiceModel.Description.ServiceEndpoint endpoint) {
        base.OnEndpointCreated(endpoint);
        // My client code turning GZIP off comes here?
    }

    public override void OnFactoryCreated(System.ServiceModel.ChannelFactory factory) {
        base.OnFactoryCreated(factory);
    }
}
//Server
using System.ServiceModel.Channels;
using IdeaBlade.Core.Wcf.Extensions;

public class ServiceEvents : IdeaBlade.EntityModel.Server.ServiceHostEvents {
    public override void OnEndpointCreated(System.ServiceModel.Description.ServiceEndpoint endpoint) {
        base.OnEndpointCreated(endpoint);
        // My server code turning GZIP off comes here?
    }

    public override void OnServiceHostCreated(System.ServiceModel.ServiceHost host) {
        base.OnServiceHostCreated(host);
    }
}