AEM 6.1 LESS version

I have been searching and digging around, but for the life of me I cannot find which version of LESS is included with AEM 6.1 (or 6.2). I know that CQ (AEM) 5.6.1 uses LESS 1.3.3, but I'd like to know if the version was updated with AEM 6.

- The compilation of the LESS logic is done on the server side using Rhino (a JavaScript engine for Java) and LESS.
- If you go to your Felix console and look under Bundles, you should find one called "Adobe Granite UI Clientlibs - Less Compiler" (com.adobe.granite.ui.clientlibs.compiler.less).
- To get to the JAR file for this bundle you will need to note the number next to it in the console. (In my case it is bundle number 193.)
- Assuming the bundle number is 193, on your file system go to "crx-quickstart/launchpad/felix/bundle193/version0.0".
- If you open bundle.jar you will find the version of LESS. (To extract and inspect the JAR you can use a tool such as JD-GUI.)
- In AEM 6.1 the bundled version of LESS is 1.7.5.

The version of LESS included with AEM 6.1 is 1.7.5.
To get to the JAR file for this bundle you will need to note the number next to it in the console; in my case it is bundle number 74.
Assuming the bundle number is 74, on your file system go to "crx-quickstart/launchpad/felix/bundle74/version0.0".
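If you'd rather not open the JAR by hand, a small throwaway program can scan it for the embedded LESS sources and a version string. This is only a sketch: the default path (with your bundle number) and the assumption that the version appears near a "version" mention in an embedded .js file are mine, not anything AEM documents.

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.util.Enumeration;
    import java.util.jar.JarEntry;
    import java.util.jar.JarFile;

    public class FindLessVersion {
        public static void main(String[] args) throws Exception {
            // Adjust to your instance and bundle number (193 here, per the answer above).
            String path = args.length > 0 ? args[0]
                    : "crx-quickstart/launchpad/felix/bundle193/version0.0/bundle.jar";
            try (JarFile jar = new JarFile(path)) {
                Enumeration<JarEntry> entries = jar.entries();
                while (entries.hasMoreElements()) {
                    JarEntry entry = entries.nextElement();
                    // The compiler bundle embeds the LESS JavaScript sources; their
                    // header comments usually carry a version string.
                    if (entry.getName().endsWith(".js") && entry.getName().contains("less")) {
                        try (BufferedReader reader = new BufferedReader(
                                new InputStreamReader(jar.getInputStream(entry)))) {
                            String line;
                            while ((line = reader.readLine()) != null) {
                                if (line.toLowerCase().contains("version")) {
                                    System.out.println(entry.getName() + ": " + line.trim());
                                    break;
                                }
                            }
                        }
                    }
                }
            }
        }
    }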

Related

Neo4J Plug In Won't Compile Due to Version Mismatch

I have seen variants of this question asked, but I have not seen an accepted answer. I am using IntelliJ IDEA to simply compile the TraversDemo sample plug-in that Neo4j posted on GitHub. When I run mvn clean package I get no errors, but when I actually put the .jar file into my plug-ins folder in Neo4j Desktop, Neo4j didn't recognize the plug-in. So I assumed something was wrong with my .jar file and tried to 'build' the Java file in IntelliJ. When I did, I got this error:
Module 'TraversalPlugIn' production: java.lang.UnsupportedClassVersionError: org/neo4j/annotations/api/PublicApiAnnotationProcessor has been compiled by a more recent version of the Java Runtime (class file version 55.0), this version of the Java Runtime only recognizes class file versions up to 54.0
I have checked and rechecked that my Java (for Windows 10) is Java 8. I checked the Java configuration GUI and asked for updates; it said I had the most recent version. Does anyone have any idea how to solve this? I already tried various options under Build, Execution, Deployment > Java Compiler, but no combination seemed to work.
I finally fixed this. The solution was: (i) cut the existing Java code (and POM file); (ii) delete the original project altogether; (iii) create a new Maven project, but choose JDK 16 as the SDK this time; and (iv) paste the Java code and POM XML into the new project. Then I built it, ran mvn clean package, dropped the target snapshot into the plug-ins folder, and it finally worked.
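For context on the error itself: class file major versions map to JDK releases as major = Java version + 44, so version 55.0 means the PublicApiAnnotationProcessor was compiled for Java 11, while a runtime that only accepts up to 54.0 is Java 10. That is why pointing the new project at JDK 16 (class files up to 60.0) made the error go away. If you want to verify which JDK produced a given .class file, a minimal sketch like this reads the version bytes directly (the class name and file argument here are hypothetical):

    import java.io.DataInputStream;
    import java.io.FileInputStream;

    public class ClassVersion {
        public static void main(String[] args) throws Exception {
            try (DataInputStream in = new DataInputStream(new FileInputStream(args[0]))) {
                int magic = in.readInt();           // always 0xCAFEBABE for class files
                int minor = in.readUnsignedShort();
                int major = in.readUnsignedShort(); // 52 = Java 8, 54 = Java 10, 55 = Java 11, 60 = Java 16
                System.out.printf("magic=%08x major=%d (Java %d)%n", magic, major, major - 44);
            }
        }
    }

Running it against the annotation processor's .class file should print major=55, confirming the JDK 11+ requirement.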

Bundles overridden by Karaf FeatureService.installFeatures()

We are using Karaf 4.1.7, and while creating our Karaf distribution we package the jackson-core 2.9.7 bundle within the Karaf system directory.
In one of our applications we use the jackson-core 2.9.2 bundle, which is added to a feature. When we install this feature using Karaf's FeatureService.installFeature(featureName), it overrides the existing 2.9.7 bundle instead of adding a new bundle. We were expecting both versions to be available and coexist.
When I run the list command I see bundle version 2.9.2 instead of 2.9.7. Another strange thing: when we check through bundleContext.getBundles() I still see the 2.9.7 bundle, and in the bundle cache I see a version0.1 folder containing the new 2.9.2 jar.
Could anyone provide pointers on how to resolve this, or tell us what we are missing here? Please let us know if you need more details.
Thank You,
Dheeraj
Both jackson-core bundle versions should be installed in the Karaf instance and will likely both get resolved; neither gets overridden. The list shell command (which resolves to bundle:list by default) only prints bundles with an OSGi start level greater than or equal to 50. The first line of its output shows this:
karaf#root()> list
START LEVEL 100 , List Threshold: 50
So the first installed bundle might be skipped in the listing. Decrease the bundle start level threshold to 0 on the command line:
bundle:list -t 0
and all installed bundles of the Karaf instance will be listed.
Regarding the result of bundleContext.getBundles(): only one version of a given bundle can be bound to a client at a time, so the first/original bound version is the one that gets reported for that particular bundle (context).
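To double-check what the framework actually has installed, independent of any start-level threshold, a minimal sketch like the following prints every bundle with its version (assuming you can run it with access to a BundleContext, e.g. from a bundle activator or a custom shell command); both jackson-core versions should appear in its output:

    import org.osgi.framework.Bundle;
    import org.osgi.framework.BundleContext;

    public class BundleLister {
        public static void printBundles(BundleContext context) {
            // getBundles() returns every installed bundle, regardless of start level.
            for (Bundle b : context.getBundles()) {
                System.out.printf("%3d %-10s %s %s%n",
                        b.getBundleId(),
                        stateName(b.getState()),
                        b.getSymbolicName(),
                        b.getVersion());
            }
        }

        private static String stateName(int state) {
            switch (state) {
                case Bundle.ACTIVE:    return "Active";
                case Bundle.RESOLVED:  return "Resolved";
                case Bundle.INSTALLED: return "Installed";
                default:               return "Other(" + state + ")";
            }
        }
    }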

Why do I have multiple gradle wrapper distributions downloaded (of the same version)?

I'm looking around in my ~/.gradle folder (relocated to P:\caches\gradle, which doesn't affect this behaviour) and I've found some strange folders.
In the past I updated multiple projects from 2.0 to 2.1 by running gradle wrapper, which updated project\gradle\wrapper\gradle-wrapper.properties.
I just found out that for some reason the zip file from the exact same distributionUrl was downloaded multiple times. The copies all have the same size, and a binary diff shows they are identical bit for bit except for the timestamps. Each of the folders in question contains the same distribution. (Screenshots of the duplicate folders and their contents omitted.)
I just updated from 2.2 to 2.2.1 and it happened again: this time it's 2 folders, not 3. Does anyone know why this happens?
I'm using Gradle from both IntelliJ IDEA and the command line.
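One detail that may help the investigation: the wrapper derives each folder name under wrapper/dists from a hash of the distributionUrl string, so any byte-level variation in that URL produces a separate folder and a fresh download. A minimal sketch of that derivation, modelled on the wrapper's PathAssembler in Gradle 2.x (treat the exact details as an assumption for your version):

    import java.math.BigInteger;
    import java.security.MessageDigest;

    public class DistFolderName {
        public static void main(String[] args) throws Exception {
            String url = "https://services.gradle.org/distributions/gradle-2.2.1-all.zip";
            MessageDigest md = MessageDigest.getInstance("MD5");
            md.update(url.getBytes());
            // The folder under ~/.gradle/wrapper/dists/gradle-2.2.1-all/ is this
            // base-36 hash, so two URL strings that differ in any byte map to
            // two different folders, each triggering its own download.
            System.out.println(new BigInteger(1, md.digest()).toString(36));
        }
    }

If the distributionUrl values in your various gradle-wrapper.properties files hash to different values, that would explain the duplicate downloads.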

In "Mono 3.0.10.20" what does 20 mean?

I want to download the latest Mono MDK of the 3.10.0 branch.
On the download site, I see that 3 packages are available: 3.10.0.0, 3.10.0.19 and 3.10.0.20.
QUESTION: What is the meaning of "20" in 3.10.0.20?
I find it strange that the 3.10.0.20 file is older than the 3.10.0.0 file.
The Mono versioning scheme documentation does not mention it.
The 20 (or 19) is just a build number.
The package without the build number is probably supposed to be a copy of the latest package, and someone forgot to update it when 3.10.0.20 was released (so it is still identical to 3.10.0.19).

Hadoop with MultipleInputs and TotalOrderPartitioner, with hadoop-eclipse-plugins

I have got two questions:
Now I have used Hadoop 0.20.203 and Hadoop 1.0.0, but I found that neither version has classes like MultipleInputs, TotalOrderPartitioner and so on. (I opened the $HADOOP_HOME/hadoop-core-1.0.0.jar file and did not find the .class files in org/apache/hadoop/mapreduce/lib/input/*.)
But I have to use them for some jobs. Did I miss anything? Which version should I choose?
I want to find an Eclipse plugin for Hadoop 1.0.2 (because I found that this version's hadoop-core-1.0.2.jar has the classes I want), and I want one that is already compiled. Where can I find it? (I have found some, but they seem unavailable.)
In short, my goal is simply to find a working Hadoop version, plus an Eclipse plugin for it, that provides TotalOrderPartitioner and the related classes. What should I do? Thanks in advance.
If they exist in 1.0.2, can you use that version instead? If your cluster currently runs 1.0.0 and you can't upgrade it to 1.0.2, you can get the source for these two classes and add them to your job JAR (as if they were code you had written); see the sketch after the download links below.
1.0.2 is available for download from the mirrors list:
http://www.apache.org/dyn/closer.cgi/hadoop/common/
For example: http://apache.mirrorcatalogs.com/hadoop/common/hadoop-1.0.2/hadoop-1.0.2-bin.tar.gz
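Once you are on a version that ships these classes, wiring them together looks roughly like this. A minimal sketch assuming the Hadoop 1.0.2 new (mapreduce) API; package names and signatures moved around between releases, so adjust to your JAR. The mapper bodies, input/output paths, and the pre-built partition file are placeholders:

    import java.io.IOException;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.MultipleInputs;
    import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
    import org.apache.hadoop.mapreduce.lib.partition.TotalOrderPartitioner;

    public class TotalOrderDemo {

        // Two trivial mappers, one per input; replace with your real logic.
        public static class MapperA extends Mapper<LongWritable, Text, Text, Text> {
            @Override
            protected void map(LongWritable key, Text value, Context ctx)
                    throws IOException, InterruptedException {
                ctx.write(new Text(value.toString()), new Text("A"));
            }
        }

        public static class MapperB extends Mapper<LongWritable, Text, Text, Text> {
            @Override
            protected void map(LongWritable key, Text value, Context ctx)
                    throws IOException, InterruptedException {
                ctx.write(new Text(value.toString()), new Text("B"));
            }
        }

        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            Job job = new Job(conf, "total-order-demo");
            job.setJarByClass(TotalOrderDemo.class);

            // Feed two input directories through different mappers.
            MultipleInputs.addInputPath(job, new Path(args[0]), TextInputFormat.class, MapperA.class);
            MultipleInputs.addInputPath(job, new Path(args[1]), TextInputFormat.class, MapperB.class);

            // Globally sorted output: the partitioner reads split points from a
            // partition file (args[2]), typically produced by InputSampler beforehand.
            job.setPartitionerClass(TotalOrderPartitioner.class);
            TotalOrderPartitioner.setPartitionFile(job.getConfiguration(), new Path(args[2]));
            job.setNumReduceTasks(4); // must be (number of split points in the partition file) + 1

            job.setReducerClass(Reducer.class); // identity reducer
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(Text.class);
            FileOutputFormat.setOutputPath(job, new Path(args[3]));

            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }

The partition file is normally generated with InputSampler, and the reduce task count has to agree with it, otherwise TotalOrderPartitioner will fail at runtime.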