Why are there multiple versions of the date time controls in Office UI Fabric? - office-ui-fabric

There is, for example, a DatePicker and a Calendar control in the main package (office-ui-fabric-react) as well as in the date-time package (installed as @uifabric/date-time). Both APIs seem to be nearly identical but differ in some properties. For example, the Calendar property calendarDayProps (requested in this issue) is missing from the main package and only available in the date-time package. I only found this out by searching through the GitHub issues, because there is no official documentation available for the date-time package. The available documentation page only covers the main package version.
On the other hand, the already closed issue nowhere mentions that the fix only applies to the date-time package version, as if this should somehow be obvious. I feel like I am missing some information.
So what's the point here? Will the date-time package version at some point in time replace the main package version? Are they intended to be used interchangeably? Why is there no documentation on this? I'm really confused ...
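To make the duplication concrete, here is a rough sketch of the two imports side by side (package and control names as described above; not tied to a specific Fabric release):

// The main package exports the date/time controls:
import { Calendar, DatePicker } from 'office-ui-fabric-react';
// The dedicated package exports the same controls, but its Calendar also
// accepts newer props such as calendarDayProps:
import { Calendar as DateTimeCalendar, DatePicker as DateTimeDatePicker } from '@uifabric/date-time';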

Related

How to use a -Manifest package in Pharo Smalltalk after File-in/Install?

I just upgraded to the newest version of Pharo Smalltalk. Before doing so, I "File-outed" a package from my old version called My-Pharo - a package I use for various configurations and customizations of Pharo itself, most notably a class to put back "Workspace" in the main menu. I then "File-ined/Installed" the file into my new version.
When I checked the SystemBrowser, I had correctly gotten the My-Pharo package, but I had also picked up a package called My-Pharo-Manifest... I see My-Pharo-Manifest is actually part of my File-Out and seems to contain the package comment for My-Pharo.
What is this manifest, what is its purpose, and how should it be used? Is there something I can/should do to "merge" the manifest (i.e. the comment) back into the My-Pharo class? Should I move the content of My-Pharo-Manifest somewhere else? ...Or is my best bet to simply delete the Manifest package and re-write the package comment for My-Pharo?
I'm not a seasoned Pharo developer; I only use it from time to time. I'll try to answer your question from the source code. For a more detailed answer you would have to ask the people who actually develop Pharo.
What is the manifest?
The manifest contains package metadata.
What is its purpose?
The purpose is to make life easier for SmallLint (the Smalltalk Code Critics). The manifest speeds SmallLint up, because without it SmallLint would have to re-check the rule results every time. The package metadata also helps in managing false positives and/or TODOs.
packages: if you check where #hasPackageNamed: is used, you will find it in SmallLintManifestChecker>>manifestBuilderOfPackage:.
methods: if you search for where #hasManifestFor: is used, you will find SmallLintManifestChecker>>manifestBuilderOfMethod:.
Is there something I can/should do to "merge" the manifest (i.e. the comment) back into the My-Pharo class? Should I move the content of My-Pharo-Manifest somewhere else?
I would just leave it be. It helps SmallLint do its job.

How to solve LaTeX package warning for "everypage"?

After updating MiKTeX, Texmaker, and all installed packages, I receive a warning when compiling my document. The warning message is the following:
Package everypage Warning: Functionality similar to this package has recently been implemented in LaTeX. This package is now in legacy status. Please, don't use it in new documents and packages.
Package everypage Warning: You appear to be running a version of LaTeX providing the new functionality. Doing the best to deliver the original `everypage` interface on top of it. Strict equivalence is not possible, breakage may occur. If truly needed, Use `everypage-1x` to force the loading of an older code base.
The warning is triggered by the line \usepackage[some]{background}, even without the option "some". So the package background depends on the package everypage, which causes the warning.
The document compiles, but I would like to resolve the warning. How can I achieve this?
EDIT: Here you can find a compilable minimal example:
\documentclass[a4paper, parskip, 10pt]{scrartcl}
\usepackage[some]{background}%Warning
\definecolor{font}{RGB}{46, 49, 51}
\begin{document}
\color{font}
{\Huge Text}
\end{document}
Here is the cause: There was a recent update to LaTeX, documented here:
LaTeX News Issue 32, October 2020
The update provided native hook management, including hooks related to page shipout, something that was previously provided by everypage. The maintainer provided an update to everypage stating this:
Package is now in a legacy status. Functionality similar to that
provided by this package is directly implemented in LaTeX since its
2020 Fall release. On new enough LaTeX formats, everypage now merely
emulates its legacy interface on top of the new LaTeX mechanisms for
compatibility reasons, while on older formats, it falls back to its
own previous code.
Do not use everypage in new documents and do not rely on it in new
packages or classes of yours.
The maintainer of background will have to update the package to utilise the new hooks rather than rely on everypage. Alternatively, write your own background-like macros (whatever that may be).
In the interim you could just suppress the warnings using silence:
\usepackage{silence}
\WarningsOff[everypage]% Suppress warnings related to package everypage
\usepackage[some]{background}% background loads everypage internally

Is there a way to compare the source code difference between 2 PyPI packages?

I have an already-built PyPI package that was uploaded to the PyPI server a few days back. Now I want to compare the source code of that already-built package with the code built today. Is there any way to do this?
I want to compare the already-built PyPI package with the newly built code, and only if there is a difference in the source code, create a new package and upload it to the PyPI server.
If you have only Python bytecode, you cannot get the corresponding source code back (that hypothetical transformation is called decompilation, and it is not possible in general; read e.g. about Rice's theorem), since any translation (such as the one done by the python program) from source code to bytecode loses some information (e.g. the names of local variables and the comments explaining the intent of the code).
Deciding equality of behavior of two functions by static analysis of their source code (and the observable behavior of your code is what you really care about) is an undecidable problem. Learn more about the λ-calculus; it is deeply related to that question.
The source code (by definition, the preferred form of code on which developers work) is not only for computers, but mostly for fellow developers: in other words, most of its value and its meaning is a social one (and that is what free software is about). Read more about the semantics of programs.
For example, renaming a variable from i to x may convey the implicit hypothesis that the intended runtime type of that variable's value is no longer an integer but a floating-point number.
Maybe you want some kind of package manager (or some version control system, if you deal with source code, or some build automation tool, if you build and then install software out of it). Python has something to manage packages. The scons build automation tool uses Python, but there are many other build automation tools, GNU make being a common one (you could use it to drive compilation from .py source files to .pyc bytecode files and their installation). For version control, I recommend git.
PS. Your question is very unclear and smells like some XY problem.
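If the practical goal is just "only upload when the files actually changed", one rough approach is to download the currently published sdist from PyPI and compare its contents with your working tree before building. Here is a sketch, assuming a src/ layout and a hypothetical package name; pip, tarfile, and hashlib do the real work, and the paths need adjusting to your project:

import hashlib
import subprocess
import tarfile
import tempfile
from pathlib import Path

# Hypothetical names: replace "mypackage" and LOCAL_SRC with your real
# package name and source directory (sdist layouts vary per project).
PACKAGE = "mypackage"
LOCAL_SRC = Path("src") / PACKAGE

def tree_digest(root):
    """Map each file's path (relative to root) to a hash of its contents."""
    return {
        str(path.relative_to(root)): hashlib.sha256(path.read_bytes()).hexdigest()
        for path in sorted(root.rglob("*"))
        if path.is_file()
    }

with tempfile.TemporaryDirectory() as tmp:
    tmpdir = Path(tmp)
    # Fetch the latest published sdist (source tarball) from PyPI.
    subprocess.run(
        ["pip", "download", "--no-binary", ":all:", "--no-deps",
         "--dest", str(tmpdir), PACKAGE],
        check=True,
    )
    sdist = next(tmpdir.glob("*.tar.gz"))
    with tarfile.open(sdist) as tar:
        tar.extractall(tmpdir / "published")
    # The sdist unpacks to e.g. mypackage-<version>/; adjust the inner path to your layout.
    published_pkg = next((tmpdir / "published").glob(PACKAGE + "-*")) / "src" / PACKAGE

    if tree_digest(published_pkg) != tree_digest(LOCAL_SRC):
        print("Source differs from the published release: build and upload a new version.")
    else:
        print("No source changes: skip the upload.")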

project.json versioning format

I'm looking for a formal definition of version number formats for .NET Core project.json files.
version
Visual Studio creates a default version number of "1.0.0-*". I would love for this to mean that the * gets updated on successive builds (it doesn't); the build version number stays 1.0.0. What does the * mean, and what are the legal possibilities?
dependencies
I expected the dependency numbering to follow the NuGet versioning rules, given that KPM is basically a NuGet front-end, but it doesn't appear to support bracket ranges (e.g. "[1,2)"): I get "not a valid version string" when I try anything other than a blank or an x.x-* format.
Outside of the source, does anyone have a link to a formal definition?
I'm not sure what's wrong with looking into the source for a definition. I think that's the most accurate place to search, especially now that vNext is hosted on GitHub.
Looking at the exception described, we're pointed to SemanticVersion.cs.
In the method TryParseInternal, it's fairly obvious why you're running into issues when attempting to declare min/max versions that way. There is simply no handling for [,] or (,) built into that method.
If we look into the regular NuGet version specification, it's obvious that TryParseVersionSpec does have this handling built in.
As for documentation specifying acceptable formats, you'll probably have to wait until it's out of CTP status. If you believe it's an issue, you should document it in GitHub. The contributors are very responsive to these types of issues. Personally I'm not sure if there's a need for setting a maximum version of a dependency when it's deployed with your build.
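For reference, a sketch of the floating-version syntax that does parse in project.json at this stage (the package names below are purely illustrative, and as far as the tooling behaves today, the trailing -* is a placeholder substituted at pack time, e.g. from a build number, rather than an auto-incrementing counter):

{
  "version": "1.0.0-*",
  "dependencies": {
    "Newtonsoft.Json": "6.0.6",
    "MyCompany.Utilities": "1.0.0-*"
  }
}

Bracket ranges such as "[1,2)" are exactly what TryParseInternal rejects with "not a valid version string".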

Examples of Semantic Version Names

I have been reading about semver. I really like the general idea. However, when it comes to putting it into practice, I feel like I'm missing some key pieces of information. I'm not sure where the name of a library fits in, or what to do with file variants. For instance, is the file name something like [framework]-[semver].min.js? Are there popular JavaScript frameworks that use semver? I don't know of any.
Thank you!
Let me try to explain.
If you are not developing a library that you want to maintain for years to come, don't bother about it. If you prefer to version every release, read the following.
Suppose you are an architect or developer building a library that is meant to be used by hundreds of developers over time, in a distributed manner. You really need to be careful about what you are doing and about what your developers are adding (interesting features that tempt you to push those changes into the currently distributed file). How do you tell your library's users when and why to upgrade, and in which scenarios? People have always followed some sort of versioning, and, interestingly, their informal schemes mostly worked fine.
Then why do you need semver?
The point is that there should be a concrete specification for a group of people to follow collectively, even if they already know it informally in their minds. With that thought, the authors made a specification: they observed and collected the best practices for versioning software and listed them on a single website. That is semver.org. Its main principles are:
Imagine you have already released your library with the version "lib.1.0.98". Now follow these rules for subsequent development.
Say your library is bundled and named xyz.
Given a version number MAJOR.MINOR.PATCH (like xyz.MAJOR.MINOR.PATCH), increment the:
1. MAJOR version when you make incompatible API changes
(existing code written by users of your library breaks if they adopt the new version without changing their programs),
2. MINOR version when you add functionality in a backwards-compatible manner
(existing code keeps working, and it may also pick up improvements in performance and features), and
3. PATCH version when you make backwards-compatible bug fixes.
Additional labels for pre-release and build metadata are available as extensions to the MAJOR.MINOR.PATCH format. For example, starting from xyz.1.0.98: a breaking API change gives xyz.2.0.0, a new backwards-compatible feature gives xyz.1.1.0, and a bug fix gives xyz.1.0.99.
If you are not a developer, or are not in a position to develop a library to such a standard, you need not worry about semver at all.
Finally, the famous d3 library follows this practice.
Semantic Versioning only defines how to name your versions. It does not specify what you will do with your version number afterwards. You can put the version number in package names, store it in a properties file inside your application, or just publish it in a wiki. All those options are open for discussion and are not part of the problem space addressed by SemVer.
semver is used by npm and bower (and perhaps some other tools) for dependency management. Using semver, it is possible to decide which versions of which packages to use when multiple libraries in use depend on the same library.
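As a concrete illustration, the semver module on npm (the library npm itself uses to resolve version ranges) exposes these rules programmatically; a small sketch:

import * as semver from 'semver';

// A caret range accepts backwards-compatible updates only:
console.log(semver.satisfies('1.3.0', '^1.2.0')); // true  (minor bump, compatible)
console.log(semver.satisfies('2.0.0', '^1.2.0')); // false (major bump, breaking)

// Incrementing a version according to the MAJOR.MINOR.PATCH rules:
console.log(semver.inc('1.0.98', 'patch')); // "1.0.99"
console.log(semver.inc('1.0.98', 'minor')); // "1.1.0"
console.log(semver.inc('1.0.98', 'major')); // "2.0.0"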
As others have said, semantic versioning is a standard versioning scheme that tells your users which versions of your library should be compatible with each other, and which ones are not.
The idea is to give your users more confidence that it is safe to upgrade to a newer patch or minor version, because it is tried, tested, and meant to be backwards compatible with the previous version. At least, that is what you are effectively telling your users.
As far as tooling goes, I don't do much in JavaScript, but I typically let my build server handle stamping my assemblies etc. with the correct version. I have a static major number I bump whenever I make breaking changes, a static minor number I bump every time I add new features, and an auto-incrementing patch number for whenever I check in bug fixes.
Especially if this is a JavaScript library you plan to share on a public repository of some kind (NuGet, gem, etc.), you probably want some form of automated packaging system, and you put the logic for specifying your version number in there (in the package metadata and/or in the name of the JavaScript file, which is the convention I've typically seen).
Take a look at sbt, the Scala Build Tool. In it, we write dependencies like this:
val scalatest = "org.scalatest" %% "core" % "2.1.7" % "test"
val jodatime = "org.joda" % "jodatime" % "1.4.5"
Here the %% operator means "append the version of Scala you are currently building against". Packaging things in this language generally creates JAR files with names like <my project>_<scala version>-<library version>.jar, which is quite handy for semantically naming things automagically. The % operator can be interpreted as "don't add the Scala version to this part."
That said, this convention resulted from the fact that the same library compiled against different Scala versions is not binary compatible across those versions. So it was a result of binary incompatibility rather than a conscious design choice.