Is there a way to have a file path in T-SQL and PostgreSQL that stays within the project directory?

I am importing data from a CSV file using COPY FROM in PostgreSQL. It works flawlessly on my machine; however, if I were to clone the repository onto a new machine, the code would cease to function because the file path is hardcoded starting at the Users directory of the computer.
In other languages, I would be able to use something like ./ or ~/ to start somewhere other than the absolute beginning of the path, but I haven't found T-SQL or Postgres to offer that functionality.
What I have
COPY persons(name,address,email,phone)
FROM '/Users/admin/Development/practice/data/persons.csv';
How can I make that file path function on any machine the project gets cloned to?
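One common workaround (a sketch, assuming the import is run through the psql client from the project root) is to use psql's \copy meta-command instead of server-side COPY. \copy reads the file on the client and resolves relative paths against psql's current working directory, so the path can stay project-relative:
-- run from the project root, e.g. psql -d mydb -f import.sql
-- note: \copy must be written on a single line
\copy persons(name,address,email,phone) FROM 'data/persons.csv' WITH (FORMAT csv)
Run from the repository root on any machine, 'data/persons.csv' then resolves correctly without a hardcoded absolute path.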

Related

Github in Parent Directory of Google Colab

I'm a noob to Google Colab and Python. I'm attempting to import a custom set of scripts from a GitHub repository. I'm using the following:
!git clone https://github.com/theAIGuysCode/tensorflow-yolov4-tflite.git
By default, this clones into a folder named after the repository. However, the functions in the needed scripts reference the parent directory rather than the repo folder name. Example:
Google Colab Screenshot
Is there a method for importing the git in the parent directory so the scripts can run without modifying the file hierarchy in each script?
The error occurs because you are in a different directory. Most likely the current directory is /content/, given that those two cells in the screenshot are at the top.
You need to change directory before you can call save_model.py; then it will work as expected. Use !pwd to check the current directory.
Before the last cell, change directory to the one containing the desired code. In this case that would be:
%cd "/content/tensorflow-yolov4-tflite"
If you are unsure about the path, right-click the folder in Colab's file browser and select Copy path to use with the cd command.
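A minimal sequence of Colab cells, then, looks like this (a sketch; the save_model.py call is taken from the screenshot):
# clone the repo, then make it the working directory before running its scripts
!git clone https://github.com/theAIGuysCode/tensorflow-yolov4-tflite.git
%cd /content/tensorflow-yolov4-tflite
!python save_model.py  # script arguments omitted; see the repo's README
Note that %cd changes the directory for the whole notebook session, while a plain !cd only affects its own subshell and does not persist.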

Laravel delete file while developing windows

In my local development environment (xampp, windows) I have the file
D:\Documents\repos\somerepo\public\img\bathroom\8\small\someimage.jpg
I try to delete it with:
Storage::delete(public_path('img/bathroom/8/small/someimage.jpg'));
However the file still exists after running the code.
When I print the path from public_path('img/bathroom/8/small/someimage.jpg') and paste it into File Explorer, it opens the file just fine.
How can I make laravel delete the file?
When I run:
if (File::exists($path)) {
    return $path." Does not exist";
}
Where $path is public_path('img/bathroom/8/small/someimage.jpg'), it tells me it does not exist.
Assuming you are using the default filesystem configuration, the public disk stores files in storage/app/public. The local disk uses storage/app. In short, both local disks manage files under storage/.
The public_path() helper returns the fully qualified path to the public directory. So:
public_path('img/bathroom/8/small/someimage.jpg')
will generate a path like this:
/your/project/public/img/bathroom/8/small/someimage.jpg
Note that this is not under storage/, where both Storage local disks operate. Passing Storage a fully qualified path outside the root of the filesystem it manages will not work.
To work with files outside the roots of the disks that Storage is configured for, you will have to use standard PHP functions like unlink(). Alternatively, move the files you want to manage with Storage into one of its configured disks, add the symlink (php artisan storage:link) to make them publicly visible, and update the references to those files in your views, etc.
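For example, a minimal sketch that deletes the file directly via the File facade (which wraps the standard PHP filesystem functions and accepts fully qualified paths), rather than Storage:
use Illuminate\Support\Facades\File;

$path = public_path('img/bathroom/8/small/someimage.jpg');

// File, unlike Storage, operates on absolute paths anywhere on disk
if (File::exists($path)) {
    File::delete($path);
}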

SSIS Deployment Variable Issue

I have created an SSIS package which uses a Foreach Loop container and an Excel connection string built from a variable so I can loop through multiple files. The package works without issue: if I have a number of files in my source folder and I simply execute the package, it loops through all the files, doing what I want it to do.
The issue arises when I deploy the package. If I have files in my source folder, it executes without error, but the files are still in the source folder afterwards. Digging a bit deeper into the package reports, it appears to report that no files were found. If I manually execute the dtsx file, it runs without issue and imports everything as it should.
Is there any reason why after deploying the package it is unable to recognise the files or the variable that I store the file name in?
Sounds like it could be permissions related. Does the SQL Server Service account have permissions to the directory where the files are stored?
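For context: a package deployed to the SSIS catalog and run from a SQL Agent job executes under the Agent/SQL Server service account (or a configured proxy), not under your interactive login, which is why it can behave differently from a manual dtsx execution. If permissions are the cause, one way to grant them (a sketch; the folder path and account name are placeholders) is with icacls on the server:
icacls "D:\SourceFolder" /grant "DOMAIN\SqlAgentService:(OI)(CI)M"
Here (OI)(CI) makes the grant inherit to files and subfolders, and M (Modify) lets the package read, archive, or delete the files.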

Copying all files from a directory using a pig command

Hey I need to copy all files from a local directory to the HDFS using pig.
In the pig script I am using the copyFromLocal command with a wildcard in the source-path
i.e copyFromLocal /home/hive/Sample/* /user
It says the source path doesn't exist.
When I use copyFromLocal /home/hive/Sample/ /user , it makes another directory in the HDFS by the name of 'Sample', which I don't need.
But when I include the file name i.e /home/hive/Sample/sample_1.txt it works.
I don't need a single file. I need to copy all the files in the directory without making a directory in HDFS.
PS: I've also tried *.txt, ?, ?.txt
No wildcards work.
Pig's copyFromLocal/copyToLocal commands work only on a single file or a directory; they will never take a series of files or a wildcard. Moreover, Pig concentrates on processing data from/to HDFS. To my knowledge, you can't even loop over the files in a local directory with ls, because it lists files in HDFS. So for this scenario I would suggest writing a shell script/action (i.e. an fs command) to copy the files from local to HDFS; see the sketch after the link below.
Check this link for more info:
http://pig.apache.org/docs/r0.7.0/piglatin_ref2.html#copyFromLocal
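A minimal shell sketch of that approach (assuming the hadoop client is on the PATH; the paths are from the question):
# the shell expands the wildcard before the command runs, so each file in
# Sample/ is copied directly into /user without creating a subdirectory
hdfs dfs -put /home/hive/Sample/* /user/
On older Hadoop installations the equivalent command is hadoop fs -put.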

Using SSIS package to zip all the txt files and move to related folder [duplicate]

I am trying to zip the contents of a Folder in SSIS, there are files and folders in the source folder and I need to zip them all individually. I can get the files to zip fine my problem is the folders.
I have to use 7-Zip to create the zipped packages.
Can anyone point me to a good tutorial. I haven't been able to implement any of the samples that I have found.
Thanks
This is how I have configured it.
It's easy to configure, but the trick is in constructing the Arguments. Though the Arguments appear static in the screenshot, they actually come from a variable, and that variable is set in the Arguments expression of the Execute Process Task.
I presume you will have this Execute Process Task inside a Foreach Loop container with a Foreach File Enumerator and 'Traverse subfolders' checked.
Once you have this basic setup in place, all you need to do is work on building the arguments to do the zipping the way you want. A good place to find all the command-line arguments is the 7-Zip command-line documentation.
Finally, the only issue I ran into was not providing a working directory in the command-line arguments for 7-Zip. The package used to run fine in my dev environment but failed when run on the server via a SQL job. This was because 7-Zip didn't have access to the 'Temp' folder on the SQL Server, which it uses by default as its working directory. I got around this problem by specifying the working directory at the end of the command-line arguments, using the -w switch:
For example:
a -t7z DestinationFile.7z SourceFile -wS:YourTempDirectoryToWhichTheSQLAgentHasRights
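If you build the arguments in a variable, the expression might look something like this (a sketch; the User variable names are placeholders):
"a -t7z \"" + @[User::DestinationFile] + "\" \"" + @[User::SourceFile] + "\" -w" + @[User::WorkingDir]
The Execute Process Task then just points its Arguments expression at that variable, so the file names can change on every iteration of the loop.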