IBM Worklight v5.0.6 Application Center - apk file upload fails - ibm-mobilefirst

When attempting to upload our apk file, the server responds back with simply
"File HelloWorld.apk file not uploaded"
Nothing is logged in trace.log in relation to this upload, so I'm not able to see any kind of log message to diagnose further. How do you enable logging for this?
Is there a timeout, or a file upload size limit? If so, how/where do you change that? The HelloWorld.apk file size is 5.6MB.

There is indeed a file size limit, but it is imposed by MySQL by default (1MB). If you are using MySQL 5.1 or 5.5 (5.6 is not supported in Worklight 5.0.x), follow these steps:
Locate the file my.ini belonging to your MySQL installation
In it, find the section [mysqld]
Underneath the section name, paste this: max_allowed_packet=1000M
Re-start the MySQL service
Re-deploy the .apk file
You may need to re-start the application server running Application Center as well.
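The steps above amount to one edit in my.ini; as a sketch (the 1000M value is just a generous example, anything larger than your biggest .apk works):

```ini
# my.ini -- MySQL server configuration
[mysqld]
# Raise the maximum packet size so large BLOB uploads (such as .apk files)
# are not rejected; the MySQL default is 1MB.
max_allowed_packet=1000M
```

After restarting the MySQL service, you can verify the new value from the MySQL client with `SHOW VARIABLES LIKE 'max_allowed_packet';`.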

Related

Can't upload or see DEPLOY button getting highlighted - can't upload artifact .tar .tgz

Artifactory Version: EnterpriseX license 7.15.3
I have valid access to deploy artifacts in all the repositories where I'm trying to upload an artifact using the Artifactory UI.
In the repository, under effective permissions, my userid is listed with all permissions checked, up to and including the Manage level.
When I try to upload a small .docx/.tar/.tgz file, it works fine.
When I try to upload a large .tgz file of 10+GB (a tar ball to be used in an air-gapped environment), the DEPLOY button never becomes highlighted so that I can click on it. It stays kind of greyed out (greenish), see picture. Normally, after you select a file to upload, the UI animates the progress from 1% up to 100% and then, after some time, shows two check boxes: one for deploying as a deploy bundle and one for using a user-defined layout format. With the 10+GB tar file, the progress reaches 100% but these two check boxes never show up in the UI, the green DEPLOY button is not highlighted so that I can click on it, and I see a RED circle with a cross (which indicates something is still not ready). Why is it even showing me 100% for the pre-upload/processing, then?
Am I missing anything? I have already raised the 100MB default file size limit I was hitting in the Artifactory repo.
PS: A related post about trying this at the command line, which gives me 502 Bad Gateway and 403 errors: Artifactory Curl -X PUT large file - 502 Bad Gateway The proxy server received an invalid response from an upstream server / 403 Bad props auth token

Guacamole SFTP not working for larger files

I am using guacamole to connect to remote devices over RDP for Windows machines and SSH for Linux. Now I would like to enable SFTP support for the connections so I enabled the option 'Enable SFTP' in the guacamole connection settings.
The problem is that SFTP works for smaller files (<3KB), creates 0KB files for slightly larger files (3KB-150KB), and raises an internal error for larger files (>150KB). I checked at what file size SFTP fails by trial, transferring files of different sizes to the remote machine.
In the screenshot, it can be seen that 'attendance.py', a smaller file of 548 bytes, is successfully transferred to the tmp folder on the Linux machine, but the other two files are created as empty files. The pdf file I tried to move is close to 180KB, which raises an Internal Error. I checked whether this error depends on the file type, but the problem occurs for all file formats. I have the same problem when transferring a file to a Windows machine configured with the RDP protocol on the same Guacamole server.
Can someone help me with this? Thanks in advance
Are you using a reverse Proxy?
I had the same problem when using nginx. By default it does not allow uploads greater than 1MB.
I changed that in nginx to a larger size and now it works.
For nginx, look for: client_max_body_size
If you are not using nginx, I would take a look at your web server's config. Remember, you are going through some sort of web server, and a file size limit there is usually very much needed.
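As a sketch, the nginx fix looks like this (assuming a standard setup where nginx proxies Guacamole; the 200M limit is an arbitrary example, pick whatever suits your largest transfers):

```nginx
# /etc/nginx/nginx.conf -- inside the http block, or in the specific
# server/location block that proxies Guacamole.
http {
    # Allow request bodies (i.e. file uploads) up to 200MB;
    # the nginx default is 1MB.
    client_max_body_size 200M;
}
```

Reload nginx afterwards with `nginx -s reload` (or `systemctl reload nginx`) for the change to take effect.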

File Share is keep on loading not showing the files in Azure Storage Explorer

In Azure Storage Explorer, I connected to a file share through the shared access signature (SAS) URI method. After connecting, no files show up under the File Shares folder as shown in the image; it keeps on loading.
After waiting for a long time, it pops up the error message shown below.
I'm using Windows 7, Azure Storage Explorer version 1.10.1, and I have the .NET 4.0 Framework installed.
Thanks.
This issue may occur for several reasons: network issues, a proxy, subscription problems, pending updates, or permissions.
Firstly, I would suggest trying these troubleshooting steps:
Delete the data in the "%AppData%/StorageExplorer" folder, or the entire folder, from your machine. When you launch Storage Explorer after deleting it, you will be prompted to re-enter your credentials.
The uninstall process does not remove all of the files in local storage, so on Windows at least I found that it helps to uninstall MASE, remove the folders under C:\Users\[username]\AppData\Roaming\Microsoft Azure Storage Explorer, and reinstall.
If you are connected to Azure through a proxy, verify that your proxy settings are correct. If you were granted access to a resource from the owner of the subscription or account, verify that you have read or list permissions for that resource.
Connection String Does Not Have Complete Configuration Settings
Refer to the following Storage Explorer troubleshooting documentation and let us know if you need further assistance: Unable to Retrieve Children
If the issue still persists, uninstall and reinstall the latest version, 1.11.2.
It would also be worth checking whether port 445 is open: since file shares are SMB-based, port 445 has to be open. Several Internet service providers block it, so it's worth testing whether it is open and you can connect to it. You can use the following tool to test it: https://gallery.technet.microsoft.com/Troubleshooting-tool-for-a9fa1fe5
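If you prefer to check port 445 yourself from a shell, a quick connectivity test looks like this (`<account>` is a placeholder for your storage account name):

```
# Test outbound TCP connectivity to port 445 (SMB) with netcat:
nc -zv <account>.file.core.windows.net 445

# On Windows 8.1 or later, PowerShell's Test-NetConnection does the same:
#   Test-NetConnection -ComputerName <account>.file.core.windows.net -Port 445
```

If the connection times out, your ISP or firewall is most likely blocking port 445.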

How to rotate all logs in GlassFish?

I can push the Rotate button from the "Server" menu item, but it rotates only the server.log file, while other files in the logs folder are not touched. Is there a way to rotate all of them?
This is the expected behaviour. The current log file is rotated/archived and a new log is created. The old logs are kept so you can look them up later. From the official Oracle documentation:
Logs are rotated automatically based on settings in the
logging.properties file. You can change these settings by using the
Administration Console.
You can rotate the server log file manually by using the rotate-log
subcommand in remote mode.
This example moves the server.log file to
yyyy-mm-dd_server.log and creates a new server.log file in the default
location.
If you want to restrict the number of log files to keep you can try to set the system property com.sun.enterprise.server.logging.max_history_files, which specifies the maximum number of log files to keep (more info here).
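As a sketch, the manual rotation and the history limit mentioned above can both be applied with asadmin (GlassFish 3.x syntax; the value 10 is just an example):

```
# Rotate the server log on demand (remote mode):
asadmin rotate-log

# Keep at most 10 rotated log files; the server must be restarted
# for the new JVM option to take effect:
asadmin create-jvm-options "-Dcom.sun.enterprise.server.logging.max_history_files=10"
```

The same property can alternatively be set in logging.properties or through the Administration Console.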

What is the fastest way to upload big files to the server

I have got a dedicated server and a file of about 4 GB to upload to it. What is the fastest and safest way to upload that file?
FTP may create issues if the connection is broken.
SFTP will have the same issue as well.
Is your own computer also reachable from the internet through a public IP?
In that case you may try to set up a simple HTTP server (if you have Windows, just set up IIS) and then use some download manager on the dedicated server (depending on its OS) to download the file over HTTP (it can use multiple streams for that), or do this through a torrent.
There are trackers, like http://openbittorrent.com/, which will allow you to keep the file on your computer and then use some torrent client to upload the file to the dedicated server.
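The simple-HTTP-server idea can be sketched with Python's built-in http.server module (a minimal example, assuming Python 3; IIS or any other web server works the same way):

```python
# Serve the directory containing the large file over HTTP, so a download
# manager on the dedicated server can fetch it. A one-off equivalent from
# the command line is: python -m http.server 8000
import threading
from http.server import HTTPServer, SimpleHTTPRequestHandler

def serve(port=8000):
    """Start an HTTP server for the current directory in a background thread."""
    httpd = HTTPServer(("0.0.0.0", port), SimpleHTTPRequestHandler)
    threading.Thread(target=httpd.serve_forever, daemon=True).start()
    return httpd

if __name__ == "__main__":
    server = serve()
    print("Serving current directory on port 8000")
```

Point the download manager at http://your-public-ip:8000/yourfile and make sure the port is forwarded/open on your router and firewall.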
I'm not sure what OS your remote server is running, but I would use wget; it has a --continue option. From the man page:
--continue
Continue getting a partially-downloaded file. This is useful when
you want to finish up a download started by a previous instance of
Wget, or by another program. For instance:
wget -c ftp://sunsite.doc.ic.ac.uk/ls-lR.Z
If there is a file named ls-lR.Z in the current directory, Wget
will assume that it is the first portion of the remote file, and
will ask the server to continue the retrieval from an offset equal
to the length of the local file.
wget binaries are available for GNU/Linux / Windows / MacOSX / dos:
http://wget.addictivecode.org/FrequentlyAskedQuestions?action=show&redirect=Faq#download