I would like to decompile a JavaScript app built with NW.js and packaged as an .exe.
I have nw.exe, nw.pak, index.html, package.json, a resources directory, and some other folders.
Launching nw.exe, the app runs perfectly, but I would like to modify it, and for that I need the source code.
Is there a way to get the JavaScript sources back from the .exe?
There are many, many ways to package an NW.js app. They're all documented on the website, but here are some easy things you can try.
Unlikely, but possible: rename the .exe to .zip and open it; there may be project files inside. Most apps aren't packaged this way because it makes every launch of the app slow, since it has to unzip the contents and copy them to a temp folder on every opening. Plus, the zip would contain the package.json, which you already have sitting outside the .exe, so this is probably not how your app was packaged.
The package.json should be the entry point; I assume index.html is what its "main" value points to. The index.html will then pull in all the other files. Some of these may live on a remote server, meaning you can't edit them locally. Some may be in a local folder, probably the "resources" folder, since that is not something that ships with NW.js by default.
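For reference, a minimal NW.js package.json usually looks something like this (the field values here are hypothetical; check the one you have to see what "main" actually points to):
{
  "name": "some-app",
  "main": "index.html",
  "window": {
    "width": 800,
    "height": 600
  }
}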
It is possible that only a light version of the app is shipped and the core contents are downloaded/replaced as it is updated. If I were doing this, I would store them in the appData folder. You can find this location with console.log(nw.App.dataPath). It is a user-account-specific location, in a folder that uses the app's name as defined in the package.json. Though this is unlikely, it is possible.
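Since you already have the index.html locally, one quick way to check this is a temporary snippet like the following (a sketch; it assumes you can see console output, e.g. via a script tag added to index.html or via DevTools if they are not disabled in the build):
// Temporary snippet to drop into index.html while investigating
var fs = require('fs');                        // Node built-ins are available in the NW.js context
console.log('dataPath:', nw.App.dataPath);     // per-user application data folder for this app
console.log(fs.readdirSync(nw.App.dataPath));  // check whether app content is stored/updated here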
It is possible to "protect" the original source code by using V8 snapshots. If this was used, you will not be able to recover the original source code. To date, there are no recorded cases of someone reverse engineering a V8 snapshot. Though it is possible, it would require an extremely high technical skill level and knowledge across several disciplines.
The source code you find may also be "uglified", meaning that although it would all be there, it would be thoroughly obfuscated and pretty useless to read. Instead of
function getUserData (url) {
  return axios.get(url)
    .then(function (response) {
      return response.data;
    });
}
it would be something like
function a(b){return c.d(b).then(e=>e.f)}
Not super obvious what is going on when uglified.
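If you do land on uglified code, a beautifier (js-beautify, Prettier, or the pretty-print button in browser DevTools) can at least restore readable formatting, though not the original names; the minified line above would come back roughly as:
function a(b) {
    return c.d(b).then(e => e.f);
}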
I have developed a Vue application and ran npm run build.
After that I uploaded the contents of the dist folder to my web page, but it showed a blank page.
Since I did this for testing, I uploaded it to a folder at public_html/mypage.com/vueapplication. To get all the paths right I added a vue.config.js with this content:
// vue.config.js
module.exports = {
  publicPath: '/vueapplication/'
}
The application now works, but I wonder:
How do I best publish/upload the application to my site? Just by dragging the content into the right folder?
How can I best maintain my site? Do I need to build again and upload, overwriting my files, every time I make an update?
And what is the difference between building and deploying your application?
Dragging and dropping your code should work, but as your app grows you may want to look into automating this. For instance, if you host the app in an S3 bucket, you can use the AWS CLI to automate the upload.
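As a rough sketch of that kind of automation (it assumes a Vue CLI project, the AWS CLI installed and configured, and a hypothetical bucket name), you could wire it into the relevant part of your package.json:
{
  "scripts": {
    "build": "vue-cli-service build",
    "deploy": "npm run build && aws s3 sync dist/ s3://your-bucket-name --delete"
  }
}
npm run deploy then rebuilds and uploads only the files that changed, removing remote files that no longer exist in dist.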
Yes, you should overwrite your deploy folder(s). You also need to take care with files that change content between deployments but keep the same name. An example is a global CSS file (main.css, for instance): its content will probably change between deployments, but the name stays the same. Browsers may cache the file, so users who downloaded an older version will not pick up the new one. There are different techniques to handle this, but if you use webpack it applies cache-busting techniques (content-hashed filenames) and you should be fine.
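With Vue CLI specifically, that hashing is on by default; for illustration, the relevant vue.config.js option (you normally don't need to set it explicitly) would be:
// vue.config.js
module.exports = {
  publicPath: '/vueapplication/',
  filenameHashing: true // default; output files get a content hash, e.g. app.d587bbd6.js
}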
Build is the process of transforming source code into one or more artifacts. Exactly what this means differs from language to language and platform to platform. In the Vue.js world it usually means a couple of JS files, a couple of CSS files, and some assets.
Deploying means taking the output of a build and making it available to your users. Again, this differs from project to project. In the Vue.js world it usually means taking the artifacts from the build and uploading them to a web server.
I'm writing a react-native app, and I want it to deploy with a zip file that contains a device firmware update.
Before letting the user send the update, I need my code to open the zip and do some validation of its contents.
I've found lots of zip-handling NPM packages, so all I need to do is load the file contents so I can feed it to one of these.
require('./firmware/fw.zip'); <-- packager doesn't include .zip by default
require('./firmware/fw.pdf'); <-- [gross hack] packager includes pdfs, but the actual result of the require() call is a number: 5. I don't know what I can do with this number to get file contents, but I'm pretty sure this require() system is designed for loading images, not binary data.
ReactNativeFs.openFile('./firmware/fw.zip'); <-- fails with ENOENT
ReactNativeFs.openFile(`${ReactNativeFs.MainBundlePath}/firmware/fw.zip`); <-- MainBundlePath is undefined on android.
This seems like a really basic question, so I'm sure I've missed a piece of documentation somewhere, but I'm heading into my third hour trying to load the contents of this file with no luck.
I'm pretty sure I could manually put the zip file into the appropriate android and ios resource directories, but that seems like a step down a hard-to-maintain road.
I encountered this problem again a couple of months later (I'm apparently the only guy that needs to package .zips in react-native), and the asset-copying approach below didn't work out for iOS. So I encoded the .zips as base64, put them in .js files, then used import to get the data from those .js files. This seems somewhat hacky, but also like a flexible long-term solution that avoids messing around with platform-dependent file locations.
See the whole answer at my new question: React-native packager configuration - How to include .zip file in bundle?
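A minimal sketch of that base64 approach (module and variable names are hypothetical; it assumes you generate the .js file with a small script during your build, and that the buffer npm package is installed, since React Native does not ship Node's Buffer):
// firmware/fwZip.js -- a generated file holding the zip as a base64 string
export default 'UEsDBBQACAAI...'; // truncated placeholder

// somewhere in the app
import { Buffer } from 'buffer';          // the npm "buffer" package, not Node's built-in
import fwBase64 from './firmware/fwZip';

const fwBytes = Buffer.from(fwBase64, 'base64'); // raw zip bytes, ready to hand to a zip library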
Partial solution:
Modify android/app/build.gradle, and add
task copyData(type: Copy) {
    from '../../firmware/fw.zip'
    into 'src/main/assets/raw/firmware'
}
preBuild.dependsOn copyData
This will at least ensure that the file gets copied each time you build, and it is then available with ReactNativeFs.readFileAssets('raw/firmware/fw.zip', 'base64'). I'm not entirely thrilled because I still need platform-dependent (iOS/Android) code when loading the file, but at least it's loading now (see the sketch after the tip below).
Tip: watch out for your syntax in Gradle. into 'src/main/assets/myFirmware.zip' will create a directory called myFirmware.zip and put your zip file underneath it. Then readFileAssets will still fail, because it finds a directory at your path, not a file.
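The platform-dependent load step mentioned above might look roughly like this (a sketch using react-native-fs; the iOS branch assumes you have also added fw.zip to the Xcode project as a bundle resource, which this answer does not cover):
import { Platform } from 'react-native';
import RNFS from 'react-native-fs';

async function loadFirmwareZip() {
  if (Platform.OS === 'android') {
    // Read from the assets folder populated by the Gradle copy task above
    return RNFS.readFileAssets('raw/firmware/fw.zip', 'base64');
  }
  // iOS: read straight from the app bundle
  return RNFS.readFile(`${RNFS.MainBundlePath}/fw.zip`, 'base64');
}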
I want to set up my HelloWorld intellij-erlang project with all files in the same directory so I can easily switch between IDEA and emacs/vim.
<my-project>/hello.erl
<my-project>/hello.beam
Now, if I configure the output directory to be the same as the source, hello.erl gets emptied as part of the build and the compilation fails.
I assumed it had something to do with copying resources to the output directory, so I configured intellij-erlang to exclude *.erl from the resources with a !?*.erl pattern, but this has no effect; hello.erl still gets emptied before the compilation takes place.
As an experiment, I've also tried using separate src and out directories, and intellij-erlang always copies the *.erl to out irrespective of the resource patterns.
Based on all this, I would conclude that intellij-erlang cannot work with all files in the same directory. Have I missed anything?
Erlang programs should be built on the standard OTP directory structure. Build tools like rebar (used by IntelliJ) or erlang.mk rely on these conventions, and so do all IDEs.
IntelliJ does this, just as you noticed, and so does the Emacs plugin (which I use and can confirm). I would guess the Vim one does too.
So if you would like to switch easily between your editors, you should stick to the convention of keeping your source files in src and compiled files in ebin (and headers in include).
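For reference, the conventional OTP-style layout for this kind of project looks roughly like this (the .hrl header is only shown to illustrate where headers go):
hello/
    src/hello.erl        % sources
    include/hello.hrl    % headers (if any)
    ebin/hello.beam      % compiled output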
I have a few sketches I'd like to distribute together. All of them use a custom library which resides in the same folder. The current directory structure is totally flat. All .ino files are in a single folder, right next to the .cpp and .h files for the library. This makes it easy to distribute and update.
This would work perfectly, except that each time I open one of the sketches to upload, the Arduino IDE forces me to move it into a subfolder, then it can't find the custom library. Is there any way to disable this behavior, or can anyone suggest a workaround? Thanks!
I tried at first to keep everything flat and found it a never-ending battle. Rather than always working around the IDE, I work with it. Here is my example.
I keep my local repo in some arbitrary location, then have symbolic links in the ./arduino/library/ directory pointing to the appropriate directories in the repo. In the example I have symbolic links for both SdFat and SFEMP3shield in the ./library/ directory. I use Windows, so rather than symlinks (the ln -s command) I use "hard junctions".
Note that the libraries use a directory structure of ./arduino/library/foo/example/bar/bar.ino, so I actually do all my projects in the ./example/bar/bar.ino sketch. It's also worth noting that I use an external editor (like Notepad++).
This way my repo contains exactly what I specify, no more and no less.
I checked quite a few similar questions, but so far I am unsatisfied with the solutions.
Ever use the Minecraft server? At initial launch, it creates all the files and folders it needs, and lets you change files like Server.properties and ops.txt by keeping them external to the executable jar file.
I'm working on a similar project, and I want to duplicate that behavior. Everything works great when I run it in Eclipse. When I export to a jar file, though, things get funky. The external files and folders are created without a hitch, but afterward it would appear as though they cannot be read from or written to. Any ideas how Notch made his server?
--edit--
Scratch that, it doesn't even appear to create the files and folders reliably. Maybe it only creates them the very first time it is run?
--edit again--
It creates them in the working directory. When I tested it in Eclipse, the working directory was the folder containing the project, and therefore everything looked fine. The solution was to make the class aware of its own location and include that in all file operations.
Have the main class in your executable jar file look up where it is, then have it store that information in a global String or something. Prefix your filenames with that string in your file operations, and voila! It's writing to the correct directory.