How to load Magritte Seaside? - smalltalk

I've loaded Magritte and Seaside from the configuration browser into Pharo 4, but I don't see that the package Magritte-Seaside was loaded.
How do I load this package?

I highly recommend loading Stephan's QCMagritte package, which includes the correct directives to load Seaside 3 with a Zinc adaptor, so you can start a web server without loading anything else.
From the MinGW command line:
$ wget -O- http://get.pharo.org/40+vm | bash
$ ./pharo-vm/Pharo.exe Pharo.image config \
"http://smalltalkhub.com/mc/Pharo/MetaRepoForPharo40" \
"ConfigurationOfQCMagritte" --printVersion --install=stable --group=All
Create an adaptor, start a web server with the Seaside Control Panel on port 8080, and then point your browser to http://localhost:8080/browse to see the applications.
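For reference, starting the adaptor from a Pharo workspace usually boils down to something like this (a minimal sketch; ZnZincServerAdaptor is the standard Seaside 3 Zinc adaptor, and the port is just an example):
"Start a Zinc-based Seaside server on port 8080"
ZnZincServerAdaptor startOn: 8080.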

The configuration browser only loads default groups for the configurations it loads. In the ConfigurationOfSeaside and ConfigurationOfMagritte you'll find many more groups.
In the ConfigurationOfQCMagritte I use 'Seaside' from Magritte and #('JQueryUI' 'JQuery-JSON') from Seaside. If you don't mind the extra packages, you could just load QCMagritte from the configuration browser.
To just add the missing packages, you could load the latest Magritte-Seaside and Magritte-Pharo-Seaside packages from the Magritte3 smalltalkhub repository with the Monticello Browser.
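If you prefer a script over the Monticello Browser, a Gofer expression along these lines should do the same thing (a sketch; the repository URL below is my assumption of the Magritte3 smalltalkhub location, adjust it if yours differs):
Gofer new
    url: 'http://smalltalkhub.com/mc/Magritte/Magritte3/main';
    package: 'Magritte-Seaside';
    package: 'Magritte-Pharo-Seaside';
    load.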
A pre-loaded QCMagritte image is available from http://ci.inria.fr/pharo-contribution/job/QCMagritte

I saw the group Seaside defined as a Metacello group in the configuration's baseline for 3.3 (which is used by 3.5, the current version). So I was able to load the package by evaluating:
(ConfigurationOfMagritte3 project version: #stable) load: 'Seaside'.


Phantomjs won't install: Autoconfiguration Error

When trying to install PhantomJS on Ubuntu 22.04, I get the error shown below. There are some Q&As on this from 2015-2022; I tried all of the included suggestions:
https://github.com/ariya/phantomjs/issues/10904
https://gist.github.com/julionc/7476620
wkhtmltopdf - libfontconfig.so.1: cannot open shared object file
http://ubuntuhowtoo.blogspot.com/2019/05/linux-nodejs-phantomjs-error-loading.html
Auto configuration failed
139998593603520:error:25066067:DSO support routines:DLFCN_LOAD:could not load the shared library:dso_dlfcn.c:185:filename(libproviders.so): libproviders.so: cannot open shared object file: No such file or directory
139998593603520:error:25070067:DSO support routines:DSO_load:could not load the shared library:dso_lib.c:244:
139998593603520:error:0E07506E:configuration file routines:MODULE_LOAD_DSO:error loading dso:conf_mod.c:285:module=providers, path=providers
139998593603520:error:0E076071:configuration file routines:MODULE_RUN:unknown module name:conf_mod.c:222:module=providers
I had the same issue and fixed it by setting export OPENSSL_CONF=/dev/null prior to PhantomJS usage. If disabling the OpenSSL configuration is not an option for you, you would have to package your application with an older OpenSSL version.
Explanation:
Ubuntu 22.04 uses the new OpenSSL version 3.0.2 instead of the older OpenSSL version 1.1.1. These OpenSSL versions are not fully compatible, which is why you see this error when PhantomJS tries to auto-configure the SSL/TLS settings.
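For example, a small wrapper like this keeps the workaround scoped to the PhantomJS call only (a sketch; my-script.js is a placeholder for your own script):
# run PhantomJS with OpenSSL's config loading disabled
OPENSSL_CONF=/dev/null phantomjs my-script.js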
The answer above is roughly what I am describing.
Step 1: First stop your Node server, then run this command in the base directory.
Step 2: Comment out the line providers = provider_sect in the file /etc/ssl/openssl.cnf (see the snippet after these steps).
Step 3: Start your application.
Brief ref: https://github.com/nodejs/node/issues/43132
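For reference, the relevant part of /etc/ssl/openssl.cnf on Ubuntu 22.04 looks roughly like this after the change (a sketch; section names may differ slightly on your system):
[openssl_init]
# providers = provider_sect    # commented out so the providers module is no longer loaded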

Installing Google Adwords Api Library (using docker)

Google's documentation on installing the library, found here: https://github.com/googleads/googleads-php-lib/blob/master/README.md#getting-started, instructs us to copy adsapi_php.ini, as constructed here: https://github.com/googleads/googleads-php-lib/blob/master/examples/AdWords/adsapi_php.ini, to your home directory.
I filled out the necessary variables in the .ini. I am using Docker, so I placed this file inside my container at /var/www/home/node/, and when I run the command composer require googleads/googleads-php-lib I get the following error in the command prompt:
Your requirements could not be resolved to an installable set of packages.
Problem 1
- Installation request for googleads/googleads-php-lib ^37.1 -> satisfiable by googleads/googleads-php-lib[37.1.0].
- googleads/googleads-php-lib 37.1.0 requires ext-soap * -> the requested PHP extension soap is missing from your system.
To enable extensions, verify that they are enabled in your .ini files:
- /usr/local/etc/php/php.ini
- /usr/local/etc/php/conf.d/adsapi_php.ini
- /usr/local/etc/php/conf.d/docker-php-ext-pdo_pgsql.ini
- /usr/local/etc/php/conf.d/docker-php-ext-sodium.ini
- /usr/local/etc/php/conf.d/docker-php-ext-xdebug.ini
You can also run `php --ini` inside terminal to see which files are used by PHP in CLI mode.
Installation failed, reverting ./composer.json to its original content.
I assumed the issue was that my adsapi_php.ini was simply in the wrong location, since it contains what I believe is necessary to avoid the above issue, but I have tried placing it in several different places and I always get the same error.
Any help would be appreciated!
Just edit php.ini inside Docker (docker exec -it {container} bash) and enable the soap extension there.
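If you build your own image from the official php base image, a Dockerfile sketch like the following should enable the extension at build time (the php:7.4-fpm tag is just an example; libxml2-dev is required to compile soap on the Debian-based images):
FROM php:7.4-fpm
# install the libxml headers needed to build the soap extension, then enable it
RUN apt-get update && apt-get install -y libxml2-dev \
    && docker-php-ext-install soap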

Create local debian repository

My goal is to demonstrate creating a local debian repository with controlled versions of tools used (e.g. compiler versions) to make a build system more predictable.
I've tried to follow this example: http://linuxconfig.org/easy-way-to-create-a-debian-package-and-local-package-repository
but when I get to the apt-get update stage, I always get a 404 not found on the repository I've added.
The apache2 server is running, I can view the default page installed at http://localhost/html/index.html.
I am trying this with the file fortune-mod_1%3a1.99.1-7_amd64.deb installed to /var/www/debs. I create the Packages.gz file as the tutorial suggests:
dpkg-scanpackages debs /dev/null | gzip -9c > debs/Packages.gz
I also add a new file: /etc/apt/sources.list.d/myppa.list with this line:
deb http://localhost debs/
I restart the apache2 service just in case:
sudo service apache2 restart
but running:
sudo apt-get update
still produces this error:
W: Failed to fetch http://localhost/debs/Packages 404 Not Found
Is there something basic I'm missing? Ultimately, I'd like to get this working over a LAN, but first have to get it working on a single machine.
EDIT: I'm doing this on Ubuntu 14.04.
EDIT: The contents of /etc/apt/sources.list.d/myppa.list are shown above.
tldr; use aptly
It's the easiest apt repository management tool I've found, and it comes with a neat tutorial showing how to create, populate, and publish your own apt repository.
References:
https://www.aptly.info/
https://www.aptly.info/tutorial/repo/
I ended up solving the problem. It was an issue with the default document root being different in the tutorial than on my system. All I did was move my debs folder into html (the document root turns out to be /var/www/html, not just /var/www, on my install). That did the trick.
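For anyone following the same tutorial, the fix boils down to something like this (a sketch; paths assume the stock Apache layout on Ubuntu 14.04):
# move the repo under Apache's actual document root
sudo mv /var/www/debs /var/www/html/debs
cd /var/www/html
# regenerate the package index relative to the document root
dpkg-scanpackages debs /dev/null | gzip -9c > debs/Packages.gz
# /etc/apt/sources.list.d/myppa.list stays unchanged: deb http://localhost debs/
sudo apt-get update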

Adding LESS file to HTML [duplicate]

I'm trying to load a 3D model, stored locally on my computer, into Three.js with JSONLoader, and that 3D model is in the same directory as the entire website.
I'm getting the "Cross origin requests are only supported for HTTP." error, but I don't know what's causing it nor how to fix it.
My crystal ball says that you are loading the model using either file:// or C:/, which is consistent with the error message, as they are not http://.
So you can either install a web server on your local PC, or upload the model somewhere else, use JSONP, and change the URL to http://example.com/path/to/model
Origin is defined in RFC-6454 as
...they have the same
scheme, host, and port. (See Section 4 for full details.)
So even though your file originates from the same host (localhost), as long as the scheme is different (http / file), they are treated as different origins.
Just to be explicit - Yes, the error is saying you cannot point your browser directly at file://some/path/some.html
Here are some options to quickly spin up a local web server to let your browser render local files
Python 2
If you have Python installed...
Change directory into the folder where your file some.html or file(s) exist using the command cd /path/to/your/folder
Start up a Python web server using the command python -m SimpleHTTPServer
This will start a web server to host your entire directory listing at http://localhost:8000
You can use a custom port: python -m SimpleHTTPServer 9000, giving you the link http://localhost:9000
This approach is built into any Python installation.
Python 3
Do the same steps, but use the following command instead: python3 -m http.server
VSCode
If you are using Visual Studio Code you can install the Live Server extension, which provides a local web server environment.
Node.js
Alternatively, if you demand a more responsive setup and already use nodejs...
Install http-server by typing npm install -g http-server
Change into your working directory, where your some.html lives
Start your http server by issuing http-server -c-1
This spins up a Node.js httpd which serves the files in your directory as static files accessible from http://localhost:8080
Ruby
If your preferred language is Ruby ... the Ruby Gods say this works as well:
ruby -run -e httpd . -p 8080
PHP
Of course PHP also has its solution.
php -S localhost:8000
In Chrome you can use this flag:
--allow-file-access-from-files
Read more here.
Ran in to this today.
I wrote some code that looked like this:
app.controller('ctrlr', function($scope, $http){
    $http.get('localhost:3000').success(function(data) {
        $scope.stuff = data;
    });
});
...but it should've looked like this:
app.controller('ctrlr', function($scope, $http){
    $http.get('http://localhost:3000').success(function(data) {
        $scope.stuff = data;
    });
});
The only difference was the lack of http:// in the second snippet of code.
Just wanted to put that out there in case there are others with a similar issue.
Just change the URL to http://localhost instead of localhost. If you open the HTML file locally, you should create a local server to serve it; the simplest way is to use Web Server for Chrome. That will fix the issue.
I'm going to list 3 different approaches to solve this issue:
Using a very lightweight npm package: Install live-server using npm install -g live-server. Then go to that directory, open the terminal, type live-server and hit enter; the page will be served at localhost:8080. BONUS: It also supports hot reloading by default.
Using a lightweight Google Chrome app developed by Google: Install the app, then go to the apps tab in Chrome and open the app. In the app, point it to the right folder. Your page will be served!
Modifying Chrome shortcut in windows: Create a Chrome browser's shortcut. Right-click on the icon and open properties. In properties, edit target to "C:\Program Files (x86)\Google\Chrome\Application\chrome.exe" --disable-web-security --user-data-dir="C:/ChromeDevSession" and save. Then using Chrome open the page using ctrl+o. NOTE: Do NOT use this shortcut for regular browsing.
Note: Use http://, like http://localhost:8080, in case you face an error.
Use http:// or https:// to create the URL
error: localhost:8080
solution: http://localhost:8080
In an Android app — for example, to allow JavaScript to have access to assets via file:///android_asset/ — use setAllowFileAccessFromFileURLs(true) on the WebSettings that you get from calling getSettings() on the WebView.
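A minimal sketch of that setup (assumes an Activity whose layout contains a WebView with the id webview; the asset file name is only an example):
import android.webkit.WebSettings;
import android.webkit.WebView;

// inside onCreate, after setContentView(...)
WebView webView = (WebView) findViewById(R.id.webview);
WebSettings settings = webView.getSettings();
settings.setJavaScriptEnabled(true);
// allow pages loaded from file:// to request other file:// URLs
settings.setAllowFileAccessFromFileURLs(true);
webView.loadUrl("file:///android_asset/index.html");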
The fastest way for me was:
For Windows users: run your file in Firefox, problem solved; or,
if you want to use Chrome, the easiest way for me was to install Python 3, then from the command prompt run python -m http.server, go to http://localhost:8000/ and navigate to your files:
python -m http.server
Easy solution for whom using VS Code
I've been getting this error for a while. Most of the answers work, but I found a different solution. If you don't want to deal with Node.js or any other solution here, and you are working with an HTML file (calling functions from another js file or fetching JSON APIs), try the Live Server extension.
It allows you to open a live server easily. And because it creates a localhost server, the problem is resolved. Simply open an HTML file, right-click in the editor and click Open with Live Server.
It basically loads the files using http://localhost/index.html instead of file://....
EDIT
It is not necessary to have a .html file. You can start the Live Server with shortcuts.
Hit (alt+L, alt+O) to Open the Server and (alt+L, alt+C) to Stop the server. [On MAC, cmd+L, cmd+O and cmd+L, cmd+C]
Hope it will help someone :)
If you use an old version of Mozilla Firefox (pre-2019), it will work as expected without any issues.
P.S. Surprisingly, old versions of Internet Explorer & Edge work absolutely fine too.
For those on Windows without Python or Node.js, there is still a lightweight solution: Mongoose.
All you do is drag the executable to wherever the root of the server should be, and run it. An icon will appear in the taskbar and it'll navigate to the server in the default browser.
Also, Z-WAMP is a 100% portable WAMP that runs in a single folder; it's awesome. That's an option if you need a quick PHP and MySQL server. Though it hasn't been updated since 2013. A modern alternative would be Laragon or WinNMP. I haven't tested them, but they are portable and worth mentioning.
Also, if you only want the absolute basics (HTML+JS), here's a tiny PowerShell script that doesn't need anything to be installed or downloaded:
$Srv = New-Object Net.HttpListener;
$Srv.Prefixes.Add("http://localhost:8080/");
$Srv.Start();
Start-Process "http://localhost:8080/index.html";
While($Srv.IsListening) {
    $Ctx = $Srv.GetContext();
    $Buf = [System.IO.File]::OpenRead((Join-Path $Pwd ($Ctx.Request.RawUrl)));
    $Ctx.Response.ContentLength64 = $Buf.Length;
    $Ctx.Response.Headers.Add("Content-Type", "text/html");
    $Buf.CopyTo($Ctx.Response.OutputStream);
    $Buf.Close();
    $Ctx.Response.Close();
};
This method is very barebones; it cannot show directories or other fancy stuff, but it handles these CORS errors just fine.
Save the script as server.ps1 and run it in the root of your project. It will launch index.html from the directory it is placed in.
I suspect it's already mentioned in some of the answers, but I'll slightly modify this into a complete working answer (easier to find and use).
Go to: https://nodejs.org/en/download/. Install nodejs.
Install http-server by running command from command prompt npm install -g http-server.
Change into your working directory, where index.html/yoursome.html resides.
Start your http server by running command http-server -c-1
Open web browser to http://localhost:8080
or http://localhost:8080/yoursome.html - depending on your html filename.
I was getting this exact error when loading an HTML file in the browser that was using a json file from the local directory. In my case, I was able to solve this by creating a simple Node server that allowed serving static content. I left the code for this at this other answer.
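That linked code isn't reproduced here, but a minimal static server with Node's built-in modules looks roughly like this (a sketch; it serves the current directory on port 8080 with no MIME-type handling):
const http = require('http');
const fs = require('fs');
const path = require('path');

http.createServer((req, res) => {
    // resolve the requested URL against the current working directory
    const file = path.join(process.cwd(), req.url === '/' ? 'index.html' : req.url);
    fs.readFile(file, (err, data) => {
        if (err) { res.writeHead(404); res.end('Not found'); }
        else { res.writeHead(200); res.end(data); }
    });
}).listen(8080, () => console.log('Serving on http://localhost:8080'));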
It simply says that the application should be run on a web server. I had the same problem with chrome, I started tomcat and moved my application there, and it worked.
I suggest you use a mini-server to run these kinds of applications on localhost (if you are not using some inbuilt server).
Here's one that is very simple to setup and run:
https://www.npmjs.com/package/tiny-server
Experienced this when I downloaded a page for offline viewing.
I just had to remove the integrity="*****" and crossorigin="anonymous" attributes from all <link> and <script> tags
If you insist on running the .html file locally and not serving it with a webserver, you can prevent those cross origin requests from happening in the first place by making the problematic resources available inline.
I had this problem when trying to serve .js files through file://. My solution was to update my build script to replace <script src="..."> tags with <script>...</script>.
Here's a gulp approach for doing that:
1. Run npm install --save-dev gulp gulp-inline del to install the packages gulp, gulp-inline and del.
2. After creating a gulpfile.js in the root directory, add the following code (just change the file paths to whatever suits you):
let gulp = require('gulp');
let inline = require('gulp-inline');
let del = require('del');

gulp.task('inline', function (done) {
    gulp.src('dist/index.html')
        .pipe(inline({
            base: 'dist/',
            disabledTypes: 'css, svg, img'
        }))
        .pipe(gulp.dest('dist/').on('finish', function () {
            done();
        }));
});

gulp.task('clean', function (done) {
    del(['dist/*.js']);
    done();
});

gulp.task('bundle-for-local', gulp.series('inline', 'clean'));
Either run gulp bundle-for-local or update your build script to run it automatically.
You can see the detailed problem and solution for my case here.
For all y'all on MacOS... set up a simple LaunchAgent to enable these glamorous capabilities in your own copy of Chrome...
Save a plist, named whatever (launch.chrome.dev.mode.plist, for example) in ~/Library/LaunchAgents with similar content to...
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
<key>Label</key>
<string>launch.chrome.dev.mode</string>
<key>ProgramArguments</key>
<array>
<string>/Applications/Google Chrome.app/Contents/MacOS/Google Chrome</string>
<string>-allow-file-access-from-files</string>
</array>
<key>RunAtLoad</key>
<true/>
</dict>
</plist>
It should launch at startup, but you can force it to do so at any time with the terminal command
launchctl load -w ~/Library/LaunchAgents/launch.chrome.dev.mode.plist
TADA! 😎 💁🏻 🙊 🙏🏾
It is not possible to load static local files (e.g. svg) without a server. If you have npm/yarn installed on your machine, you can set up a simple HTTP server using http-server:
npm install http-server -g
http-server [path] [options]
Or open a terminal in that project folder and type hs. It will automatically start an HTTP live server.
I just found some official words: "Attempting to load unbuilt, remote AMD modules that use the dojo/text plugin will fail due to cross-origin security restrictions. (Built versions of AMD modules are unaffected because the calls to dojo/text are eliminated by the build system.)" https://dojotoolkit.org/documentation/tutorials/1.10/cdn/
One way that worked for me was to keep the local files inside the project folder instead of outside it. Create a folder under your project (for example, files), similar to the way we do for images, and replace the absolute local path with a relative URL pointing to the file under that project folder.
It worked for me.
Install a local web server: for Java, e.g. Tomcat; for PHP you can use LAMP, etc.
Drop the json file into the publicly accessible app server directory.
Start the app server, and you should be able to access the file from localhost.
For Linux Python users:
import webbrowser
browser = webbrowser.get('google-chrome --allow-file-access-from-files %s')
browser.open(url)
The URL should be like:
createUserURL = "http://www.localhost:3000/api/angular/users"
instead of:
createUserURL = "localhost:3000/api/angular/users"
There can be many causes for this problem; in my case it was a missing '/'. Example:
jquery-1.10.2.js:8720 XMLHttpRequest cannot load http://localhost:xxxProduct/getList_tagLabels/
It must be: http://localhost:xxx/Product/getList_tagLabels/
I hope this helps anyone who runs into this problem.
I have also been able to recreate this error message when using an anchor tag with the following href:
Example a tag
In my case an a tag was being used to get the 'Pointer Cursor' and the event was actually controlled by some jQuery on click event. I removed the href and added a class that applies:
cursor:pointer;
Cordova achieves this. I still cannot figure out how Cordova does it; it does not even go through shouldInterceptRequest.
Later I found out that the key to loading any local file is: myWebView.getSettings().setAllowUniversalAccessFromFileURLs(true);
And when you want to access any http resource, the WebView will do a check with the OPTIONS method, which you can grant by returning a response from WebViewClient.shouldInterceptRequest; for the following GET/POST requests, you can just return null.
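A sketch of that idea (assumes API 21+, where WebResourceResponse accepts a status code and headers; the header values are illustrative, not the exact ones Cordova uses):
// needs android.webkit.*, java.io.ByteArrayInputStream, java.util.HashMap, java.util.Map
myWebView.setWebViewClient(new WebViewClient() {
    @Override
    public WebResourceResponse shouldInterceptRequest(WebView view, WebResourceRequest request) {
        if ("OPTIONS".equals(request.getMethod())) {
            // answer the CORS preflight ourselves
            Map<String, String> headers = new HashMap<>();
            headers.put("Access-Control-Allow-Origin", "*");
            headers.put("Access-Control-Allow-Methods", "GET, POST, OPTIONS");
            headers.put("Access-Control-Allow-Headers", "Content-Type");
            return new WebResourceResponse("text/plain", "utf-8", 200, "OK",
                    headers, new ByteArrayInputStream(new byte[0]));
        }
        return null; // let GET/POST requests proceed normally
    }
});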
If you are searching for a solution for Firebase Hosting, you can run the
firebase serve --only hosting command from the Firebase CLI
That's what I came here for, so I thought I'd just leave it here to help others like me.
If you're using VS Code, just try loading a live server in there. It fixed my problem immediately.

Install Pageturner within Plone 4.1

We have updated our Plone sites to version 4.1.
We would also like to install the Pageturner PDF viewer on the site to give our users more functionality for the medical articles which are published as PDFs.
We added two parts to the buildout file:
eggs =
    .......
    wc.pageturner
zcml =
    .......
    wc.pageturner
We also installed the swftools on our Ubuntu server machine with the command:
$ sudo apt-get install swftools
Running buildout goes well, without any errors.
After starting the zinstance the Plone sites are not available within the browser.
If you are interested, we could write some functional documentation to add on plone.org with the product.
What can I do? Where is the error? Am I missing something?
It seems to be a common problem related to the way permissions are handled in Plone 4.1.
Just add the following:
<include package="Products.CMFCore" file="permissions.zcml"
xmlns:zcml="http://namespaces.zope.org/zcml"
zcml:condition="have plone-41"
/>
in every configure.zcml file where CMF permissions (cmf.ManagePortal, cmf.ModifyPortalContent, etc.) are mentioned.