Testing using local files

I'm looking for the best practice to follow when testing with Go using local files.
By local files, I mean that in order to test its functionality, the application needs some local files, because it reads from these files frequently.
I'm not sure whether I should write temporary files myself just before running the tests, using the ioutil package's TempDir and TempFile functions, or create a test folder like so:
testing/...test_files_here
main.go
main_test.go
and then read the contents from inside
testing/...

A folder named testdata is usually used for this purpose as it is ignored by the go tool (see go help packages).
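For illustration, a test that reads such a fixture might look like the sketch below; the package name and the testdata/config.json file name are placeholders, since the question doesn't name the real ones. go test runs each test with the package's own directory as the working directory, so a relative path into testdata is enough.

package app

import (
    "io/ioutil"
    "path/filepath"
    "testing"
)

// TestLoadConfig reads a fixture from the package-local testdata
// directory, which the go tool ignores when building.
func TestLoadConfig(t *testing.T) {
    path := filepath.Join("testdata", "config.json") // hypothetical fixture name
    data, err := ioutil.ReadFile(path)
    if err != nil {
        t.Fatalf("reading fixture: %v", err)
    }
    if len(data) == 0 {
        t.Fatal("fixture is empty")
    }
    // ...feed data into the code under test...
}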

This is my current test setup:
app/
    main.go
    main_test.go
    main_testdata
    package1/
        package1.go
        package1_test.go
        package1_testdata1
    package2/
        package2.go
        package2_test.go
        package2_testdata1
All the test data that is specific to a single package is placed within the directory of that package. Common test data that will be used by multiple packages is placed either in the application root or in $HOME.
This setup works for me. It's easy to change the data and test without having to do extra typing:
vim package1_test_data1; go test app/package1
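If some of the data is generated by the test rather than checked in, the temporary-file route mentioned in the question still fits alongside this layout. A minimal sketch, assuming the test only needs a throwaway file (the file name and contents are invented):

package package1

import (
    "io/ioutil"
    "os"
    "path/filepath"
    "testing"
)

// TestWithGeneratedFile writes a throwaway fixture into a temporary
// directory and removes the whole directory when the test finishes.
func TestWithGeneratedFile(t *testing.T) {
    dir, err := ioutil.TempDir("", "package1-test")
    if err != nil {
        t.Fatal(err)
    }
    defer os.RemoveAll(dir)

    path := filepath.Join(dir, "input.txt") // hypothetical file name
    if err := ioutil.WriteFile(path, []byte("generated fixture"), 0600); err != nil {
        t.Fatal(err)
    }
    // ...point the code under test at path...
}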

Related

Extracting MSDeploy Zip package using variables

I’m setting up an automated build in VSTS that will FTP the published files to my server.
I have this working but the way I’ve achieved it, I feel is hacky and non-sustainable.
The process, as you can see from the screenshots, will publish the artefact, which consists of a readme, a cmd file, and a zip containing all my published files, and then I extract the ZIP using the very explicit location below.
$(Build.ArtifactStagingDirectory)\temp\Content\d_C\a\1\s\IntermittentBug\IntermittentBug\obj\Release_EukHosts\Package\PackageTmp
I’m using a hosted build server in VSTS but as the path contains
d_C\a\1\s\
I assume this will change in time. What I need is a variable to cater for this path so it will always succeed.
How can I update this to make it more efficient and sustainable?
First, as jessehouwing said, the variable is called Build.SourcesDirectory.
Regarding the path structure, the simple way is to specify the /p:PackageTempRootDir="" MSBuild argument in the Visual Studio Build task to remove the source path structure; the path will then look like Content\D_C\PackageTmp.
On the other hand, you can also publish the web app through File System mode.
This path is captured in a predefined variable called Build.SourcesDirectory. See the complete list of predefined variables here.
In your batch or PowerShell scripts this variable is available as an environment variable called %BUILD_SOURCESDIRECTORY% / $env:BUILD_SOURCESDIRECTORY.

How to add new file to Go project

I am using the Go plugin for IntelliJ Idea.
I'm not sure how I got my project working like this, but I am able to run my Main.go file and it includes all the other files in the project that I reference. That is perfect.
The problem now is that when I go and create a new "*.go" file, it's not included in the IDE build and I get compile errors wherever I refer to the contents of that file.
How can I fix this?
I think you can set GOPATH in ~/.bash_profile, e.g.:
GOPATH=~/code/go
export GOPATH
Then run source ~/.bash_profile or restart the terminal for it to take effect.
Next, put the *.go files (or the Go project that goes with main.go) under the GOPATH.
Finally, main.go will find the other *.go files.
There are two types of run configurations for Go applications:
- Go Single file -> which is the equivalent of go run file.go
- Go Application -> which is the equivalent of go build file / package and run the binary
By the sound of it, you want to run a Go Application with Run kind set to package. There you'll need to type the full package name, for example: github.com/dlsniper/demo/cmd/democmd
At the moment, support for running multiple files / building a directory is not present (there are some open issues for it).
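As a rough sketch of what a package-level build picks up (the import path and function names below are invented for illustration): files in the same directory declare the same package and can call each other without any import, so a new file only needs the right package clause and a package build, not a single-file run, to be seen.

// $GOPATH/src/github.com/you/demo/main.go (example path)
package main

import "fmt"

func main() {
    fmt.Println(greeting()) // defined in greeting.go, same package
}

// $GOPATH/src/github.com/you/demo/greeting.go (example path)
package main

// greeting lives in a separate file of the same package,
// so main.go can call it without importing anything.
func greeting() string {
    return "hello from another file"
}

Running go build github.com/you/demo (or the Go Application run configuration with Run kind set to package) compiles both files together.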

Windows Form VB.Net - Attaching empty Directories for Deployment

I'm creating a GUI in order to launch a batch file which then kicks off a PowerShell script. The GUI compiles fine and everything works great; however, when I go to deploy it, the deployment doesn't actually include any of the empty directories my script relies on.
How can I add empty directories to be included in my published VB form during install?
I don't think you can. Why don't you just do
' Requires Imports System.IO
If Not Directory.Exists(dir) Then
    Directory.CreateDirectory(dir)
End If
for each directory? I would create a list of directories over which to enumerate and run this each time the application is run.
You can always use a post-build step to either create the directories you need or do other logic that your program may need, such as running a batch file or PowerShell script.
See the example below. It will create a directory Test in the output directory where the .exe is placed.

Using SSIS package to zip all the txt files and move to related folder [duplicate]

I am trying to zip the contents of a folder in SSIS; there are files and folders in the source folder and I need to zip them all individually. I can get the files to zip fine; my problem is the folders.
I have to use 7-Zip to create the zipped packages.
Can anyone point me to a good tutorial. I haven't been able to implement any of the samples that I have found.
Thanks
This is how I have configured it.
It's easy to configure, but the trick is in constructing the Arguments. Though you see the Arguments as static in the screenshot, they actually come from a variable, and that variable is set in the Arguments expression of the Execute Process Task.
I presume you will have this Execute Process Task in a For Each File Enumerator with Traverse Subfolders checked.
Once you have this basic setup in place, all you need to do is work on building the arguments to do the zipping, how you want them. A good place to find all the command line arguments is here.
Finally, the only issue I ran into was not providing a working directory in the command line arguments for 7-Zip. The package ran fine in my dev environment but failed when running on the server via a SQL job. This was because 7-Zip didn't have access to the 'Temp' folder on the SQL Server, which it uses by default as the working directory. I got round this problem by specifying the working directory at the end of the command line arguments, using the -w switch:
For example:
a -t7z DestinationFile.7z SourceFile -wS:YourTempDirectoryToWhichTheSQLAgentHasRights

jars, external properties, and external file io

I checked quite a few similar questions, but so far I am unsatisfied with the solutions.
Ever use the Minecraft Server? At initial launch, it creates all the files and folders it needs, and allows you to make changes to files like Server.properties and ops.txt by making them external to the executable jar file.
I'm working on a similar project, and I want to duplicate that behavior. Everything works great when I run it in Eclipse. When I export to a jar file, though, things get funky. The external files and folders are created without a hitch, but afterward, it would appear as though they cannot be read from or written to. Any ideas how Notch made his server?
--edit--
Scratch that, it doesn't even appear to reliably create the files and folders. Maybe it only creates them the very first run after creation?
--edit again--
It creates them in the root directory. When I tested it in Eclipse, the root directory was limited to the folder containing the project, and therefore looked fine. The solution was to make the class aware of its location and include it in all file operations.
Have the main class in your executable jar file look up where it is, then have it store that information in a global String or something. Prefix your filenames with that string in your file operations, and voila! It's writing to the correct directory.