WordNet Morphmaps: Why does WordNet 3.0 have a limited number of morphmaps?

I have been analysing the WordNet 3.0 MySQL database files that I downloaded from:
http://www.princeton.edu/wordnet/download/current-version/
I have noticed that there are only 4962 morphmaps present. This means that if I search for the term "Carrying" I get zero results.
If, however, I search for "Carrying" on the WordNet 3.1 search page, http://wordnetweb.princeton.edu/perl/webwn, I get directed to "Carry", which is the expected result.
I assume that WordNet 3.1 has more rows in its morphmaps table, and that this is why "Carrying" yields the correct result.
Does anyone know:
1) Does WordNet 3.1 have more morphmaps than 3.0?
2) Where might I source the MySQL database files for WordNet 3.1?
3) Are there other ways of getting more morphmaps into WordNet?
Thanks for your kind assistance.

You don't use morphmaps to convert a word into its base form; you use a lemmatizer. You can check out this link to see the many lemmatizers available.
The web version of WordNet already incorporates some software on top of the data; it is not just a plain search on the database. And even then, after lemmatization it searches the sense database (e.g. index.sense), not the morphmaps table.
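For example, here is a minimal sketch of the lemmatization step using Stanford CoreNLP, one of many possible lemmatizers; the library choice, the class name and the "carrying" example are my own illustration, not something prescribed by WordNet, and it assumes a recent CoreNLP release on the classpath:

    import java.util.Properties;
    import edu.stanford.nlp.ling.CoreLabel;
    import edu.stanford.nlp.pipeline.CoreDocument;
    import edu.stanford.nlp.pipeline.StanfordCoreNLP;

    public class LemmaDemo {
        public static void main(String[] args) {
            // The lemma annotator needs POS tags, so include "pos" before "lemma"
            Properties props = new Properties();
            props.setProperty("annotators", "tokenize,ssplit,pos,lemma");
            StanfordCoreNLP pipeline = new StanfordCoreNLP(props);

            CoreDocument doc = new CoreDocument("carrying");
            pipeline.annotate(doc);
            for (CoreLabel token : doc.tokens()) {
                // Expected output: carrying -> carry
                System.out.println(token.word() + " -> " + token.lemma());
            }
        }
    }

Once you have the lemma ("carry"), you can look it up directly in the sense/synset tables, which is essentially what the web interface does for you.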

Related

Where can I find the Cypher grammar specification?

As part of an assignment at university I have to go through the Cypher specification. The link on the openCypher site is broken. Is what is published on GitHub the current version of the openCypher specification? I don't see the version denoted anywhere.
The file name in the grammar link has not been updated to M19, as all the other links have been.
It should be https://s3.amazonaws.com/artifacts.opencypher.org/M19/grammar-M19.zip.
You should inform the openCypher project so that it can be fixed. I have created a pull request to fix the links: https://github.com/opencypher/website/pull/27

CSV export from ActiveScaffold

It looks like a couple of plugins have been developed over the past few years for Rails/ActiveScaffold to export an ActiveScaffold index view as CSV. One of them (last commits around 5 years ago) is referenced in the answer here:
How to Make CSV Format Report of User Information using Active Scaffold in Rails 2.3.8
The projects I can find look like they're dead, and despite searching I'm not finding updated versions.
Before I dive in and modernise one of these, I wanted to ask: is there a current (ActiveScaffold 3.4) plugin that will allow CSV export?
Edit: I found https://github.com/naaano/active_scaffold_export, which was last updated in 2013.
It turns out that, despite its two-year-old update status, Naaano's GitHub project took almost nothing to get working with ActiveScaffold 3.4.
Head over to https://github.com/naaano/active_scaffold_export, grab the code, and follow the very well-written install/config instructions. Then simply modify the gemspec to require AS >= 3.4.0, rebuild the gem, and you're good to go.

Using WebApiContrib.Formatting.Xlsx straight from a DataTable/query

I have data of unknown shape that I get straight from a query/DataTable. How do I use the WebApiContrib.Formatting.Xlsx library in this case? It asks me to have a model for every xlsx report. I have tried to generate a dynamic class from the DataTable, but it doesn't seem to work.
This will be possible with the 2.0 release, which supports custom column resolvers and more robust serialisation for ExpandoObject. You can grab the prerelease version now on NuGet, and I plan to document the new functionality over the nice big break I have coming up soon.

Missing packages in Lucene 4.0 snapshot

Does anybody know why there is no QueryParser, no IndexWriter.MaxFieldLength(25000), and some other classes in the Lucene 4.0 snapshot?
I'm having a hard time porting my code to this newer version, even though I'm following the documentation given here: http://search-lucene.com/jd/lucene/overview-summary.html
How do I find the missing packages, and how do I get them? The snapshot jar doesn't seem to contain all the features.
Thanks
Lucene has been re-architected, and some classes which used to be in the core module are now in submodules. You will now find the QueryParser classes in the queryparser submodule. Similarly, lots of useful analyzers, tokenizers and token filters have been moved to the analysis submodule.
Regarding IndexWriter, the maximum field length option has been deprecated; it is now recommended to wrap an analyzer with LimitTokenCountAnalyzer (in the analysis submodule) instead.
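Here is a minimal sketch of how this fits together in 4.0, assuming the lucene-core, lucene-analyzers-common and lucene-queryparser jars are on the classpath; the "body" field and the "index" directory are just placeholder values:

    import java.io.File;
    import org.apache.lucene.analysis.Analyzer;
    import org.apache.lucene.analysis.miscellaneous.LimitTokenCountAnalyzer;
    import org.apache.lucene.analysis.standard.StandardAnalyzer;
    import org.apache.lucene.document.Document;
    import org.apache.lucene.document.Field;
    import org.apache.lucene.document.TextField;
    import org.apache.lucene.index.IndexWriter;
    import org.apache.lucene.index.IndexWriterConfig;
    import org.apache.lucene.queryparser.classic.QueryParser;
    import org.apache.lucene.search.Query;
    import org.apache.lucene.store.Directory;
    import org.apache.lucene.store.FSDirectory;
    import org.apache.lucene.util.Version;

    public class Lucene4Sketch {
        public static void main(String[] args) throws Exception {
            // Replaces the old IndexWriter.MaxFieldLength(25000): wrap the analyzer instead
            Analyzer limited = new LimitTokenCountAnalyzer(
                    new StandardAnalyzer(Version.LUCENE_40), 25000);

            Directory dir = FSDirectory.open(new File("index"));
            IndexWriterConfig config = new IndexWriterConfig(Version.LUCENE_40, limited);
            IndexWriter writer = new IndexWriter(dir, config);

            Document doc = new Document();
            doc.add(new TextField("body", "some example text", Field.Store.YES));
            writer.addDocument(doc);
            writer.close();

            // QueryParser now lives in the queryparser submodule (classic package)
            QueryParser parser = new QueryParser(Version.LUCENE_40, "body", limited);
            Query query = parser.parse("example");
            System.out.println(query);
        }
    }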

Field.Store.COMPRESS in Lucene 3.0.2

I am upgrading Lucene 2.4.1 to 3.0.2 in my Java web project.
In the Lucene API I found that Field.Store.COMPRESS is not present in 3.0.2, so what can I use in place of Field.Store.COMPRESS?
Sometimes the field data is so large that I have to compress it.
Lucene made the decision not to compress fields, as it was really slow and not Lucene's forte. The Javadocs say:
Please use CompressionTools instead. For string fields that were previously indexed and stored using compression, the new way to achieve this is: First add the field indexed-only (no store) and additionally using the same field name as a binary, stored field with CompressionTools.compressString(java.lang.String).
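A rough sketch of that pattern against the 3.0.x API; the field name "contents" and the helper class are my own illustration, and I am assuming the binary-field constructor that takes a Field.Store argument is the one available in 3.0.2:

    import org.apache.lucene.document.CompressionTools;
    import org.apache.lucene.document.Document;
    import org.apache.lucene.document.Field;

    public class CompressedFieldSketch {
        public static Document buildDocument(String text) {
            Document doc = new Document();
            // Indexed-only variant of the field (not stored)
            doc.add(new Field("contents", text, Field.Store.NO, Field.Index.ANALYZED));
            // Stored, compressed binary variant under the same field name
            doc.add(new Field("contents", CompressionTools.compressString(text), Field.Store.YES));
            return doc;
        }

        public static String readBack(Document doc) throws Exception {
            // Decompress the stored bytes when reading the document back
            return CompressionTools.decompressString(doc.getBinaryValue("contents"));
        }
    }

The trade-off is that compression and decompression now happen in your own code rather than inside Lucene, which is in line with Lucene's decision to stop compressing fields itself.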