I am trying to use the Redis connector in Mule, but when I start the application it fails with the error below:
org.xml.sax.SAXParseException: schema_reference.4: Failed to read schema document 'http://www.mulesoft.org/schema/mule/redis/current/mule-redis.xsd', because 1) could not find the document; 2) the document could not be read; 3) the root element of the document is not <xsd:schema>
I checked the URL, and it returns a 404 error. I tried replacing "current" with "3.2", but no luck.
Does anybody have an idea how we can get it running?
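For reference, a Mule 3 config that uses the connector declares the Redis namespace with boilerplate along these lines (a sketch; the connector config attributes are illustrative). Note that Mule normally resolves the schemaLocation URL to a local copy of the XSD bundled inside the connector jar, so the 404 from the live URL only becomes fatal when the mule-module-redis jar is missing from the application's classpath:

<mule xmlns="http://www.mulesoft.org/schema/mule/core"
      xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
      xmlns:redis="http://www.mulesoft.org/schema/mule/redis"
      xsi:schemaLocation="
          http://www.mulesoft.org/schema/mule/core http://www.mulesoft.org/schema/mule/core/current/mule.xsd
          http://www.mulesoft.org/schema/mule/redis http://www.mulesoft.org/schema/mule/redis/current/mule-redis.xsd">
    <!-- illustrative config element; exact attributes depend on the module version -->
    <redis:config name="redisConfig" host="localhost" port="6379"/>
</mule>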
I have developed an application in Mule 3 that transforms data and then uploads it as a file to an SFTP location. I have handled all the common errors, such as the HTTP 400 and 500 series, but what is the proper status code to return when SFTP fails, for example on file upload, connection, or permission errors?
I have searched a lot on the internet and the more I search the more I get lost.
Does anyone have experience with this?
Thanks
If you are asking for a table mapping error codes between SFTP and HTTP, there is no standard for it; these are completely different protocols. You have to define your own mapping. Most SFTP failures will probably map to 5xx in HTTP, with authentication errors probably 403.
I am not sure which connector version you use, but open the documentation of the SFTP connector, for example: https://docs.mulesoft.com/sftp-connector/1.4/sftp-documentation.
The documentation lists, per operation, the errors that can be thrown; the copy operation, for example, has its own list of error types.
Base your logic on those errors. The HTTP connector throws errors in the same way, but in the HTTP namespace. If needed you can also remap errors to a different, new namespace and implement logic based on the remapped errors, as in the sketch below.
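A minimal sketch (Mule 4 syntax, to match the linked docs; the flow, the paths, and the chosen status codes are illustrative, and the exact SFTP error types should be confirmed against the connector documentation):

<flow name="sftpUploadFlow">
    <http:listener config-ref="HTTP_Listener_config" path="/upload">
        <http:response statusCode="201"/>
        <http:error-response statusCode="#[vars.httpStatus default 500]"/>
    </http:listener>
    <sftp:write config-ref="SFTP_Config" path="#[attributes.headers.fileName]"/>
    <error-handler>
        <on-error-propagate type="SFTP:CONNECTIVITY">
            <!-- SFTP server unreachable: 503 Service Unavailable is a defensible mapping -->
            <set-variable variableName="httpStatus" value="503"/>
        </on-error-propagate>
        <on-error-propagate type="SFTP:ILLEGAL_PATH">
            <!-- bad target path or insufficient permissions: 403 Forbidden, per the note above -->
            <set-variable variableName="httpStatus" value="403"/>
        </on-error-propagate>
    </error-handler>
</flow>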
All of a sudden we are seeing this random error/exception in our web application:
Failed to load resource: the server responded with a status of 502 (Bad Gateway).
In the Log Stream, we are seeing the following details, with the specific error code 502.3 - Bad Gateway: Forwarder Connection Error (ARR).
Also, sometimes the browser itself displays "The CGI application did not return a valid set of HTTP errors."
Most search results for these error codes point to "IIS / Proxy Server" configuration, but we haven't changed any such settings.
The error happens very randomly and is not specific to any user action or function. The same functionality works on the first execution, then throws this error on a second execution immediately afterwards.
How do we figure out what is causing this, and how do we fix it?
I searched for this question because the program was normal at the beginning and the 502.3 errors only appeared later. After checking the information available online, I feel it can only give us leads; it cannot solve your problem immediately.
So my suggestion is that you first read post1 and post2 that I provided.
Next, troubleshoot according to the steps in the official documentation; specific errors require specific analysis. One concrete starting point is to enable Failed Request Tracing, as sketched below.
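A minimal sketch, assuming an IIS where you control web.config (the provider/area list follows the standard Failed Request Tracing example; narrow it as needed). Placed under <system.webServer>, it keeps a verbose trace for every request that ends in a 502:

<tracing>
    <traceFailedRequests>
        <add path="*">
            <traceAreas>
                <!-- verbose trace of the core pipeline; ARR forwarding failures show up under the WWW Server provider -->
                <add provider="WWW Server"
                     areas="Authentication,Security,Filter,StaticFile,CGI,Compression,Cache,RequestNotifications,Module"
                     verbosity="Verbose"/>
            </traceAreas>
            <!-- keep traces only for requests that ended in a 502 -->
            <failureDefinitions statusCodes="502"/>
        </add>
    </traceFailedRequests>
</tracing>

The resulting traces (by default under %SystemDrive%\inetpub\logs\FailedReqLogFiles) usually show which pipeline stage the forwarder failed in.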
I am trying to upload multiple files simultaneously to Google BigQuery using the command-line tool, and I got the following error:
BigQuery error in load operation: Could not connect with BigQuery server.
Http response status: 503
Http response content:
Service Unavailable
Is there any way to work around this problem?
How do I upload multiple files simultaneously to Google BigQuery using the command-line tool?
Multiple file upload should work (we use it every day). If you're getting a 503, that indicates something is wrong with the service. One thing you might want to make sure of is that if you're using a * in your command line, you have it quoted so that the shell doesn't expand it before it gets passed to bq.
If you're getting a 503 error, can you retry the command with the flag --apilog=- (this needs to be one of the first parameters)? It will dump the interaction with the server to stdout. The problem may be obvious from that log, but if it isn't, can you update your question with the relevant portions of the log? If you're not comfortable posting that information on a public forum, you can e-mail it to me at tigani at google dot com.
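For example, a sketch combining both points (the dataset, table, bucket, and inline schema are placeholders):

# --apilog=- is a global flag, so it must come before the load command;
# the quotes stop the shell from expanding the * before bq sees it
bq --apilog=- load mydataset.mytable "gs://my_bucket/data_*.csv" name:string,count:integer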
When following the instructions on http://developer.gooddata.com/article/loading-data-via-api, I always get an HTTP 400 error:
400: Neither expected file "upload_info.json" nor archive "upload.zip" found (is accessible) in ""
When I HTTP GET the same path that I used for the HTTP PUT, the file downloads just fine.
Any pointers to what I'm probably doing wrong?
GoodData is going through a migration from AWS to Rackspace.
Try changing the host in all GET/POST/PUT requests:
secure.gooddata.com to na1.secure.gooddata.com
secure-di.gooddata.com to na1-di.gooddata.com
You can check the datacenter where the project is located via the /gdc/projects/{projectId} resource, in the "project.content.cluster" field.
For example:
https://secure.gooddata.com/gdc/projects/myProjectId:
{
"project" : {
"content" : {
"cluster" : "na1",
....
For AWS this field has an empty value; "na1" means Rackspace.
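A sketch of that check with curl (the project ID is a placeholder, and authentication is assumed to be handled already, e.g. via previously obtained GoodData session cookies stored in cookies.txt):

# fetch the project resource and inspect project.content.cluster
curl -s -b cookies.txt -H "Accept: application/json" \
    "https://secure.gooddata.com/gdc/projects/myProjectId"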
I'm trying to index a site with Apache Nutch 1.4, and when I run the command below the following error occurs: "java.io.IOException: Job failed".
bin/nutch solrindex http://localhost:8983/solr/ crawl/crawldb -linkdb crawl/linkdb crawl/segments/*
I installed "Tomca6" and "Apache Solr 3.5.0" to work with Nutch but unfortunately is not working
Console session:
root@debian:/usr/share/nutch/runtime/local$ bin/nutch solrindex http://localhost:8983/solr/ crawl/crawldb -linkdb crawl/linkdb crawl/segments/*
SolrIndexer: starting at 2012-03-28 18:45:25
Adding 48 documents
java.io.IOException: Job failed!
root@debian:/usr/share/nutch/runtime/local$
Can someone help me please?
This error often occurs if the mapping of Nutch result fields onto Solr fields is incorrect or incomplete, which makes the Solr server reject the "update" action. Unfortunately, at some point in the call chain this error is converted into an "IO error", which is a little misleading. My recommendation is to access the web console of the Solr server (which is accessible at the same URL used for submitting links, e.g. in this case http://some.solr.server:8983/solr/) and go to the logging tab. Errors concerning the mapping will show up there!
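For reference, Nutch 1.4 ships a ready-made conf/schema.xml that is meant to be copied into Solr's conf/ directory; the excerpt-style sketch below lists just a few of the fields it defines, to illustrate what the mapping has to cover (copy the real file rather than retyping it):

<!-- sketch of Nutch-required Solr fields; field names follow Nutch's shipped schema.xml, flags here are illustrative -->
<fields>
    <field name="id" type="string" stored="true" indexed="true"/>
    <field name="digest" type="string" stored="true" indexed="true"/>
    <field name="url" type="string" stored="true" indexed="true"/>
    <field name="title" type="text" stored="true" indexed="true"/>
    <field name="content" type="text" stored="true" indexed="true"/>
    <field name="tstamp" type="string" stored="true" indexed="false"/>
</fields>

If a field Nutch sends is missing from the schema, Solr rejects the update and Nutch surfaces it as the generic "Job failed" IOException shown above.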
Looks like Solr is not configured correctly. (Also ensure that the input linkdb, crawldb, and segments are present at the locations you pass on the command line.)
Read:
Setting up Solr 1.4 with Apache Tomcat 6.X
Nutch 1.3 and Solr Integration