After reading (nearly) the whole ebook and taking a look at the API,
I am still asking myself how to realize "traditional" web server behaviour with Opa.
I understand (at least I believe so) that Opa links external resources specified at
compile time into the executable, making them immutable and permanent.
But what if, say, I want to change the stylesheet of an application without recompiling it?
There seem to be a few methods in the stdlib (apidoc), but they do not cover
what I am used to from other programming languages.
One possible solution I can think of is to use the internal database,
but that seems like overkill for something as simple as traditional file I/O.
Edit: this blog post explains more about dealing with external resources in Opa.
Long story short: you'll rarely work with external files in Opa.
Let me try to break this down. Opa will indeed embed resources. But in development mode you just want to be able to tweak them (mainly CSS) and see the changes immediately. If you compiled your program in a non-release mode, then it supports this kind of action (try --help; below is an excerpt):
Debugging Resources : dynamic edition:
[...]
--debug-editable-css
Export the CSS files embedded in the server to the file
system, so that they can be viewed and edited during
execution of the application
For many other editable and changing resources one would indeed use the database.
And if you really need to work with files (again: with Opa you'll need this much less than with traditional web languages), then take a look at stdlib.io and, for advanced use, at the BslFile module with bindings to OCaml functions for file manipulation.
I think this module is for you:
http://opalang.org/resources/doc/index.html#file.opa.html/!/value_stdlib.io
import stdlib.io
my_css = File.content("css/file.css")
I am not seeing a way to write files, but I think if you need to write, you should use the db.
But for reading, I think this is the solution :)
I am looking at using a static generator to generate up to hundreds of thousands of pages (on S3) of data-driven content from JSON or CSV files, each of which has an HTML form that posts to an external API. Is this a feasible undertaking?
It depends on your requirements, but at minimum you might even get away with a simple Node program that uses fs to read/write. Going up the complexity spectrum, you might make do with a Gulp setup. Going even further up the spectrum, you can use a static website generator to read/write your data files (but that's probably worth the trouble only if you already know a static generator and/or you will want to have a blog on S3 as well, driven by .md files, besides the hundreds of thousands of data-driven pages).
If you go the simple Node script route, you would create your application in a JS file and run it from the command line with Node. It would generate the thousands of pages locally, and then you would upload them to S3. You can either use standard callbacks or a fancier approach using promises (for example with Bluebird). This way is the most manual, but you have the most control over the result.
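For illustration only, a minimal sketch of such a script might look like the following (the data file name, the slug/title/body columns, the output folder and the API URL are all placeholders, and the CSV parsing is deliberately naive):
// generate.js - rough sketch of the plain Node approach, not a finished tool.
const fs = require('fs');
const path = require('path');

const outDir = path.join(__dirname, 'dist');
fs.mkdirSync(outDir, { recursive: true });

// Assumes a data.csv with a header row "slug,title,body" and no quoted commas.
const rows = fs.readFileSync('data.csv', 'utf8').trim().split('\n');
const header = rows.shift().split(',');

for (const line of rows) {
  const values = line.split(',');
  const record = Object.fromEntries(header.map((key, i) => [key, values[i]]));

  // One static page per record, each with a form posting to the external API.
  const html = [
    '<!doctype html>',
    '<html><head><title>' + record.title + '</title></head><body>',
    '<h1>' + record.title + '</h1>',
    '<p>' + record.body + '</p>',
    '<form action="https://api.example.com/submit" method="post">',
    '  <input type="hidden" name="slug" value="' + record.slug + '">',
    '  <button type="submit">Send</button>',
    '</form>',
    '</body></html>'
  ].join('\n');

  fs.writeFileSync(path.join(outDir, record.slug + '.html'), html);
}

console.log('Wrote ' + rows.length + ' pages to ' + outDir);
// Afterwards, push the folder to S3, e.g. with "aws s3 sync dist/ s3://your-bucket".
From there you can grow it into a Gulp task or swap in a proper CSV parser if the data gets more complex.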
For the record, you could whip up a script in any programming language you are proficient in, for example PHP. JavaScript is popular these days, which is why I'm assuming you would use JS.
If you go the Gulp route, I imagine a custom task that reads the data files from their location, parses their contents into an array, and writes the contents out into files.
If you go the Hugo route, simply use its data-driven content feature, namely the getCSV function. You'll still need to work in the context of a website, which means the more you stray from the website's setup, the more you'll have to fight Hugo.
As I mentioned, the argument against a static website generator is that if you don't need the website part and only want to perform operations on data and write files, it might stand in your way.
Hugo is a good option for thousands of files because it's fast.
The solution also depends on whether your CSV files are going to change or this is a one-off thing, and on how much automation you need. The Gulp approach might be handy even if you go the Hugo route.
So, yes, it is a very feasible undertaking.
I'm new to MODx, but am quite impressed with its power and flexibility. There's only one caveat, and I'm hoping it's just because I don't know any better.
I'm a frontend dev, and I'm used to building websites of all sizes. But I usually work with files and version control. How would I keep this paradigm with MODx?
From my poking around so far, the only way I have found to use an IDE is to keep static files with my code and later copy/paste them into MODx Manager. Far from ideal.
I'm aware that a lot of people use an "include" snippet to include snippets, chunks, etc. Does this work for MODx-specific tags? For example, if I include a file as a snippet and I have a template variable defined in there (or a resource link), would that be properly rendered?
Also, is there a performance hit when a snippet includes a file, versus having the snippet code entered directly into MODx Manager?
Bottom line, how do you develop sites on MODx? Where do you enter your code? Is there a feature like the "Import HTML" but for snippets and chunks? Is there a way to create new Templates, Documents, Chunks, TVs, etc. without going through the Manager?
Thanks in advance!
There is a whole documentation site for developing in MODx, http://rtfm.modx.com/display/revolution20/Home - though it mostly concerns extending it, not customization & modification. The short answer is no, there is no version control for your snippets & such; yes, you will have to maintain them manually. [I wish that was not the case]
Most of your PHP code will go into either a snippet or a plugin, and yes, you can include static files in either of those resource types. No, I don't know if there is a performance gain/loss, but I would imagine "no" if your include is cacheable.
For the includes you can do something like this:
include_once $modx->config['base_path'].'_path_to_my.php_';
-sean
There is VersionX for Revolution, which will give you version control of chunks, snippets, resources and so on.
There is a package called Auditor that will allow you to implement version control in MODx.
EDIT
Sorry, I just noticed your question is tagged Revolution; Auditor is for Evo. I don't think there's a solution available yet, although I believe it is on the roadmap.
Suppose I am working on exposing some of my server-side classes to a GWT application, but certain parts could be done much better using GWT-specific components (like JSNI, for instance).
What are some techniques for doing so without being too hacky?
For instance, I am aware of using a subpackage and the <super-source/> tag, but this requires the package names to be different, which causes Eclipse to complain. The general solution in the community is to then tell Eclipse to use that as a source folder, but then Eclipse complains about there being two classes with the same name.
Ideally, there would just be a way to keep everything in a single source tree, and actually have different classes which apply the alternate implementations. This would feel like a more OO approach.
I would like to be able to add a suffix like _gwt to a class to accomplish this automatically, and I know I could write a script to do this kind of transformation, but that is a kludge for sure.
I've been considering using Google's GIN/GUICE libraries for my projects in general, and I think there might be some kind of a solution there, but I am not sure as I have not thoroughly investigated it.
What are some solutions you have tried in the past on GWT projects?
The easiest way to have split implementations is to use super-source code, but only enough to instantiate a uniquely-named instance or dispatch to a different method. Ideally, the super-source implementation is just a few lines long, and not so bad that you can't roll it by hand.
To work around the Eclipse / javac double-mapping and package name issues, the GWT source uses two top-level roots for user code: user/src and user/super. For example, the AutoBeans package has a split-implementation of JSON quoting and evaluation, one for the JVM and one for the browser.
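A rough sketch of what that looks like in practice (the module, package and folder names below are invented for illustration; the key point is that the re-rooted copy declares the same package as the original):
// com/example/MyModule.gwt.xml (excerpt) -- overlay re-rooted sources:
//   <super-source path="super"/>

// src/com/example/shared/JsonCodec.java -- plain JVM version, used on the server:
package com.example.shared;

public class JsonCodec {
    public static String quote(String raw) {
        return "\"" + raw.replace("\"", "\\\"") + "\"";   // ordinary Java
    }
}

// gwt-super/com/example/super/com/example/shared/JsonCodec.java -- the browser
// version: same package and class name, free to use JSNI; the GWT compiler
// re-roots it via super-source. Keeping gwt-super out of Eclipse's source
// folders (the way the GWT tree separates user/src from user/super) avoids
// the duplicate-class complaint.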
There's really no non-kludgy way to implement super-source, as this is a feature way outside what you can specify in the language. There's nothing that lets you say "use this implementation in this environment" without the use of some external tool.
I've been trying to figure out if there is already an accepted method for testing file I/O operations in Haskell, but I have yet to find any information that is useful for what I am trying to do.
I'm writing a small library that performs various file system operations (recursively traverse a directory and return a list of all files; sync multiple directories so that each contains the same files, using inodes as the equality test and hardlinks; ...) and I want to make sure that they actually work, but the only way I can think of to test them is to create a temporary directory with a known structure and compare the results of the functions executed on this temporary directory with the known results. The thing is, I would like to get as much test coverage as possible while keeping it mainly automated: I don't want to have to create the directory structure by hand.
I have searched Google and Hackage, but the packages that I have seen on Hackage do not use any testing -- maybe I just picked the wrong ones -- and anything I find on Google does not deal with IO testing.
Any help would be appreciated.
Thanks, James
Maybe you can find a way to make this work for you.
EDIT:
the packages that I have seen on hackage do not use any testing
I have found a unit testing framework for Haskell on Hackage. Using this framework, maybe you could use assertions to verify that the files you require are present in the directories where you want them and that they correspond to their intended purpose.
HUnit is the usual library for IO-based tests. I don't know of a set of properties/combinators for file actions -- that would be useful.
There is no reason why your test code cannot create a temporary directory, and check its contents after running your impure code.
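As a small sketch of that idea (using HUnit plus the temporary and directory packages; listFilesRecursively is just a stand-in for the function under test, not part of any of those libraries):
import Test.HUnit
import System.IO.Temp (withSystemTempDirectory)
import System.Directory (createDirectory, doesDirectoryExist, listDirectory)
import System.FilePath ((</>))
import Control.Monad (forM)
import Data.List (sort)

-- Stand-in for the library function under test; swap in the real one.
listFilesRecursively :: FilePath -> IO [FilePath]
listFilesRecursively dir = do
  entries <- listDirectory dir
  paths <- forM entries $ \entry -> do
    let p = dir </> entry
    isDir <- doesDirectoryExist p
    if isDir then listFilesRecursively p else return [p]
  return (concat paths)

testTraversal :: Test
testTraversal = TestCase $
  withSystemTempDirectory "fileio-test" $ \dir -> do
    -- Build the known directory structure programmatically, not by hand.
    createDirectory (dir </> "sub")
    writeFile (dir </> "a.txt") "a"
    writeFile (dir </> "sub" </> "b.txt") "b"
    found <- listFilesRecursively dir
    sort found @?= sort [dir </> "a.txt", dir </> "sub" </> "b.txt"]

main :: IO ()
main = runTestTT testTraversal >> return ()
The same pattern should extend to the sync functions: build two trees, run the sync, then compare what the unix package's getFileStatus/fileID report for matching files.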
If you want mainly automated testing of monadic code, you might want to look into Monadic QuickCheck. You can write down properties that you think should be true, such as
If you create a file with read permission, it will be possible to open the file for reading.
If you remove a file, it can no longer be opened.
Whatever else you think of...
QuickCheck will then generate random tests.
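For example, the first property above might be sketched roughly like this with Test.QuickCheck.Monadic (the names and the temp-directory setup are my own, not an established combinator set):
import Test.QuickCheck
import Test.QuickCheck.Monadic (monadicIO, run, assert)
import System.IO.Temp (withSystemTempDirectory)
import System.FilePath ((</>))

-- Property: a file we have just written can be opened and read back intact.
prop_createdFileIsReadable :: Property
prop_createdFileIsReadable =
  forAll (listOf1 (elements ['a'..'z'])) $ \name ->   -- random but safe file names
    monadicIO $ do
      ok <- run $ withSystemTempDirectory "qc-test" $ \dir -> do
        writeFile (dir </> name) "hello"
        contents <- readFile (dir </> name)
        return $! contents == "hello"   -- force the lazy read before cleanup
      assert ok

main :: IO ()
main = quickCheck prop_createdFileIsReadable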
I'm writing a new document-based cross-platform chemistry application (Win, Mac, Unix), which saves files in its own format (no standard format exists for this field). I'm trying to decide on a file extension for the saved files. My questions are:
How important is it nowadays to stick to 3 characters?
Where can you check how much this file extension is already used? (Google helps, of course, but it does not tell me how popular a given app is.)
Do I really need to use a file-specific extension? My save format is gzip'ed XML, so I could name it .xml.gz, but I fear it would confuse beginning users (i.e. when you see it, it does not immediately "ring a bell").
Finally, do you have other important guidelines when choosing extensions for your own programs?
PS: I tried to keep the right balance between "giving too little information" and "being too specific to be really useful to others". I'll happily provide more information in comments if the need arises.
FileInfo.com lists a lot of file extensions along with its own estimation of how much each one is used.
I suggest a unique extension (rather than .xml.gz) so that the OS can identify the file type to users when looking at a file listing or whatever. 'Ringing a bell' is important, especially if you will have less sophisticated users.
I don't see any need to stick to 3 characters, but I wouldn't go bigger than 5 (I don't suppose I have a real reason for this, other than personal preference).
How important is it nowadays to stick to 3 characters?
It's not unless you have to support older operating systems. All current OSes handle >3 char file extensions without any problems. Think of .html, .config, .resx, and I'm sure there are more.
Where can you check how much this file extension is already used?
Check out FileExt.
Do I really need to use a file-specific extension? My save format is gzip'ed XML, so I could name it .xml.gz, but I fear it would confuse beginning users (i.e. when you see it, it does not immediately "ring a bell").
Remember that Windows (and Windows users) associate files with applications by extension, so using something too generic like .xml.gz may cause problems. You are probably better off coming up with something that is more specific to your file type or application. Users don't care whether your format is gzipped XML internally; they care about what is in the file. Think about abstraction layers: your users will think of it as a file containing chemistry info, not gzipped XML, so .chem is far more appropriate than .xml.gz.
Some suggestions of things to think about:
Obviously, don't clash with anything big - Don't use .doc, .xls, .exe, etc.
Don't clash with anything common in your industry domain that your user demographic is likely to have installed. For example, if you are writing a programming tool, don't use .cs or .cpp. You probably know your domain best, so write a list of all the apps you and your users are likely to have installed, and any of their competitors and avoid them.
Make sure your app includes options to register and unregister the extension. Don't just do it automatically during installation; make sure it's an option.
Remember that Unix/Linux and Mac are case sensitive, so consider sticking to all lower case by default.
Remember that CD/DVD file naming rules are stricter, so don't use non-alphanumeric characters.
Finally, remember that most non-tech users are going to have file extensions turned off, so don't stress about it too much.
There is more info here.
Wikipedia has lists of file extensions here (by type) and here (alphabetical), and also some general information.
Depends on the platform, but in general not very important for newer operating systems. Check the documentation for the platforms you're targeting.
I'm not aware of better alternatives to Google. Hopefully someone else has a better suggestion for this one.
Not unless you have some reason to do so. An example would be "I want to ensure that Windows always opens this file type with my app". I'm not sure that your users need to be concerned with the extension anyway. The default configuration on Windows, for example, is to hide extensions for known file types. BUT if you have a compelling reason (such as allowing your program to easily identify files it should be able to handle), then you could use the extension, or you could come up with something else.
I have only ever once written a program where I thought I needed to come up with my own extension. I used my initials. Then later I realized I didn't really need a special extension and reverted to ".xml". However, most extensions seem to mean something (.doc for documents, etc.), so something meaningful is a good idea if you do need to go this route.
It sure depends on the OSes you want to support, but people have globally moved past the 3-character extension limit these days: .html is widely used for webpages, for example.
Of course, if you go to much longer extensions, people will stop visually recognizing them as file extensions, I think...
Barring your needing to be compatible with a specific OS that you know still has the three-letter limitation, no need to keep it to three characters. It may be useful to have a three-character version of it if you end up supporting those platforms.
The Wikipedia list of file formats is pretty good. Some mime mapping lists will list common extensions associated with those mappings. Ray already mentioned FileInfo.com.
It's a convenience thing; I'd probably go with your own extension but document the fact that the files are just gzipped XML conforming to a specific DTD, and make it easy for users to use .xml.gz instead. Be sure that your software doesn't care about the extension, so that users could even choose their own if they wanted, although I'd tend to avoid encouraging that by providing a reasonable default.
I'd go for typeability, clarity, uniqueness, and brevity -- in that order. For instance, .config is a lot easier to type than .q2z but it falls down on uniqueness. (I'm not suggesting it for your app; it's an example.) Similarly, .q2z is just a pain. :-) So for instance, .chemstuff is easy to type and probably not in wide use elsewhere. (Again, not a suggestion, just an example.)
Have it as document_name.app_name.xml.gz where document_name and app_name are variables, the latter some easily readable and recognisable short string of your application's title.
Modern systems are quite flexible, and there is absolutely no need to drag the 3-character extensions further along in time with us.
I agree that .xml.gz would confuse users; however, keep in mind that modern systems are moving toward recognizing files not by their extensions but by probing their headers and even contents. In fact, users often do not even see the extensions. For gzipped XML files, a system may decide to first unpack the file stream in memory, then find out it is literally an XML file, and then take its 'xmlns' as the application identifier. However, such systems are not yet in widespread use. In any case, don't make the mistake of only opening files by extension - be smart and raise the bar - do exactly the above to find out whether the file can be considered a document for your application.
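If you do go down that road, the content check can be a handful of lines. A rough sketch in Python (the namespace URI is a placeholder; the idea is just the gzip magic bytes plus the root element's namespace):
# Rough sketch: decide whether a file is "ours" by content, not by extension.
# Assumes the format is gzipped XML whose root element carries a namespace
# such as "http://example.com/chemistry" (placeholder).
import gzip
import xml.etree.ElementTree as ET

APP_NAMESPACE = "http://example.com/chemistry"  # placeholder namespace

def is_our_document(path):
    with open(path, "rb") as f:
        magic = f.read(2)
    if magic != b"\x1f\x8b":          # gzip magic bytes
        return False
    try:
        with gzip.open(path, "rb") as f:
            root = ET.parse(f).getroot()
    except (OSError, ET.ParseError):
        return False
    # ElementTree writes the namespace into the tag as "{namespace}localname".
    return root.tag.startswith("{" + APP_NAMESPACE + "}")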