Automatically import data into SQL Server

I am trying to import data from a .CSV file into SQL Server automatically. For example, if I have a set of .CSV files in a folder, I want the data from those .CSV files to be imported into SQL Server automatically, every time I add a new .CSV file to that folder.

If you want the import to be truly automatic (triggered whenever a new file appears), you will need to go beyond plain SQL. If a scheduled load is good enough, a simple SSIS package can read the CSV files into your database, and you can then schedule it as a job with SQL Server Agent to run hourly, daily, weekly, etc.

Edit: the example below is for PHP.
// Collect every file in the watched folder
$filenames = glob($_SERVER['DOCUMENT_ROOT'] . '/filespath/*');
foreach ($filenames as $filename) { // iterate over the files
    // Skip anything that is missing or is not a .csv file
    if (!file_exists($filename) || substr($filename, -3) !== 'csv') continue;
    $file = fopen($filename, "r");
    $csv = array();
    while (($line = fgetcsv($file)) !== FALSE) {
        // $line is an array of the CSV fields on that row
        $csv[] = $line;
    }
    fclose($file);
    foreach ($csv as $key => $value) {
        // put your code here to work with the $csv array.
    }
}
You can use this code as a starting point. To run it automatically, schedule the script with cron, which you can usually set up from your host's cPanel.
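To actually get the parsed rows into SQL Server from PHP, one option is a parameterized insert through PDO. Below is a minimal sketch, assuming the pdo_sqlsrv driver is available, that $csv holds the rows parsed above, and that my_table with columns col1 and col2 is a hypothetical target table; replace the DSN, credentials, table and column names with your own.

// Minimal sketch: push the parsed rows into SQL Server through PDO.
// Assumptions: pdo_sqlsrv is installed, $csv holds the rows parsed above,
// and my_table(col1, col2) is a hypothetical table.
$pdo = new PDO('sqlsrv:Server=localhost;Database=mydb', 'user', 'password');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$stmt = $pdo->prepare('INSERT INTO my_table (col1, col2) VALUES (?, ?)');

$pdo->beginTransaction();
foreach ($csv as $row) {
    // $row[0] and $row[1] are the first two columns of the CSV line
    $stmt->execute(array($row[0], $row[1]));
}
$pdo->commit();

Wrapping the inserts in a transaction means a file either imports completely or not at all, which makes re-running the cron job after a failure safer.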

Related

Laravel move files from one disk to another disk - using `Storage`

I have two disks defined in my filesystems.php config file:
'd1' => [
    'driver' => 'local',
    'root' => storage_path('app/d1'),
],
'd2' => [
    'driver' => 'local',
    'root' => storage_path('app/d2'),
],
These disks could also be Amazon S3 buckets, or there could be a combination of an S3 bucket and a local disk.
Let's say I have a file as app/d1/myfile.txt which I want to move to app/d2/myfile.txt.
What I'm doing now is
$f = 'myfile.txt';
$file = Storage::disk('d1')->get($f);
Storage::disk('d2')->put($f, $file);
and leaving the original file on d1 as it doesn't bother me (I periodically delete files from d1).
My questions are:
Is the code below atomic? How would I check whether it is, and if it is not, how would I make it atomic (for scenarios where the files are around 1GB in size)?
$f = 'myfile.txt';
$file = Storage::disk('d1')->get($f);
Storage::disk('d2')->put($f, $file);
Storage::disk('d1')->delete($f);
Is there a simple way to move files from one disk to another using the Storage facade? At the moment I need it to work between two local disks, but in the future I might need to move files within one S3 bucket, from one S3 bucket to another, or from a local disk to an S3 bucket.
Thanks
I think this approach is cleaner, and it also works if you're using remote paths:
$directories = ['dir1', 'dir2', 'dir3'];
$from = 'public';
$to = 'assets';

foreach ($directories as $directory) {
    $files = Storage::disk($from)->allFiles($directory);
    foreach ($files as $file) {
        // Stream the copy so large files are never loaded fully into memory
        Storage::disk($to)->writeStream($file, Storage::disk($from)->readStream($file));
        // If you no longer need the originals
        //Storage::disk($from)->delete($file);
    }
    Storage::disk($from)->deleteDirectory($directory);
}
The move method may be used to rename or move an existing file to a new location.
Storage::move('old/file.jpg', 'new/file.jpg');
However, to move a file between disks this way you need the full filesystem path of the file.
// convert to full filesystem paths (Flysystem 1.x adapter API)
$pathSource = Storage::disk($sourceDisk)->getDriver()->getAdapter()->applyPathPrefix($sourceFile);
$destinationPath = Storage::disk($destDisk)->getDriver()->getAdapter()->applyPathPrefix($destFile);

// make the destination folder if it does not exist yet
if (!File::exists(dirname($destinationPath))) {
    File::makeDirectory(dirname($destinationPath), null, true);
}

// note: File::move works on real paths, so this approach only suits local disks
File::move($pathSource, $destinationPath);
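If you need a move that also works when one or both disks are S3 (where there is no local path to hand to File::move) and that avoids loading a 1GB file into memory, you can combine a streamed copy with a delete. A minimal sketch follows; moveAcrossDisks is a hypothetical helper name, and the operation is not atomic, since the copy and the delete are two separate steps.

use Illuminate\Support\Facades\Storage;

// Sketch of a streamed cross-disk "move": copy via streams, then delete the source.
// Not atomic - if the process dies between the two calls, the source file remains.
function moveAcrossDisks(string $fromDisk, string $toDisk, string $path): void
{
    // readStream/writeStream keep memory usage flat even for very large files
    $stream = Storage::disk($fromDisk)->readStream($path);
    Storage::disk($toDisk)->writeStream($path, $stream);

    // Some adapters close the stream for you, hence the is_resource() guard
    if (is_resource($stream)) {
        fclose($stream);
    }

    // Delete the source only after the destination write has completed
    Storage::disk($fromDisk)->delete($path);
}

// Usage with the disks from the question:
// moveAcrossDisks('d1', 'd2', 'myfile.txt');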

Export data from BigQuery to Cloud Storage - PHP client library - there is one extra empty line in the Cloud Storage file

I followed this sample
https://cloud.google.com/bigquery/docs/exporting-data
public function exportDailyRecordsToCloudStorage($date, $tableId)
{
    $validTableIds = ['table1', 'table2'];
    if (!in_array($tableId, $validTableIds)) {
        die("Wrong TableId");
    }

    $date = date("Ymd", date(strtotime($date)));
    $datasetId = $date;

    $dataset = $this->bigQuery->dataset($datasetId);
    $table = $dataset->table($tableId);

    // load the storage object
    $storage = $this->storage;
    $bucketName = 'mybucket';
    $objectName = "daily_records/{$tableId}_" . $date;
    $destinationObject = $storage->bucket($bucketName)->object($objectName);

    // create the export job
    $format = 'NEWLINE_DELIMITED_JSON';
    $options = ['jobConfig' => ['destinationFormat' => $format]];
    $job = $table->export($destinationObject, $options);

    // poll the job until it is complete
    $backoff = new ExponentialBackoff(10);
    $backoff->execute(function () use ($job) {
        print('Waiting for job to complete' . PHP_EOL);
        $job->reload();
        if (!$job->isComplete()) {
            //throw new Exception('Job has not yet completed', 500);
        }
    });

    // check if the job has errors
    if (isset($job->info()['status']['errorResult'])) {
        $error = $job->info()['status']['errorResult']['message'];
        printf('Error running job: %s' . PHP_EOL, $error);
    } else {
        print('Data exported successfully' . PHP_EOL);
    }
}
I have 37670 rows in my table1, and the cloud storage file has 37671 lines.
And I have 388065 rows in my table2, and the cloud storage file has 388066 lines.
The last line in both cloud storage files is an empty line.
Is this expected behaviour of the BigQuery export (something for a feature request), or did I do something wrong in my code above?
What you described seems like an unexpected outcome; the output file should generally have the same number of lines as the source table.
Your PHP code looks fine and shouldn't be the cause of the issue.
I tried to reproduce this but was unable to. Could you double-check whether the last empty line is somehow added by another tool, such as a text editor? How are you counting the lines of the resulting output?
If you have ruled that out and are sure the newline is indeed added by the BigQuery export feature, please consider opening a bug in the BigQuery Issue Tracker, as suggested by xuejian, and include your job ID so that we can investigate further.
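One thing worth checking is how the lines are counted: each exported record ends with a newline character, so a file with N records ends in a final newline, and tools that treat the text after the last newline as an extra (empty) line will report N+1. The sketch below, using the Google Cloud Storage PHP client, downloads the object and compares the two interpretations; the bucket and object names are hypothetical values modelled on the code above.

use Google\Cloud\Storage\StorageClient;

// Sketch: download the exported object and compare line-count interpretations.
// Assumes google/cloud-storage is installed; bucket/object names are the
// hypothetical values used in the question.
$storage  = new StorageClient();
$contents = $storage->bucket('mybucket')
    ->object('daily_records/table1_20170101')
    ->downloadAsString();

$newlineCount  = substr_count($contents, "\n");  // newline characters
$nonEmptyLines = count(array_filter(explode("\n", $contents), 'strlen'));

printf(
    "newline characters: %d, non-empty lines: %d, ends with newline: %s\n",
    $newlineCount,
    $nonEmptyLines,
    substr($contents, -1) === "\n" ? 'yes' : 'no'
);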

Magento2 read uploaded csv file

I have uploaded a CSV file, but the problem is that I don't know how to read the data of the uploaded CSV file in Magento 2.
Can anyone help me?
Thanks in advance.
You can do this much as you would in Magento 1. In Magento 1 you would use
new Varien_File_Csv, but in Magento 2 you can do the same with \Magento\Framework\File\Csv. You can use the following code.
In your __construct() inject the following classes:
protected $_moduleReader;
protected $_fileCsv;

public function __construct(
    \Magento\Backend\App\Action\Context $context,
    \Magento\Framework\Module\Dir\Reader $moduleReader,
    \Magento\Framework\File\Csv $fileCsv
) {
    $this->_moduleReader = $moduleReader;
    $this->_fileCsv = $fileCsv;
    parent::__construct($context); // If your class doesn't have a parent, you don't need this call, of course.
}
Then you can use it like this:
// This is the directory where you put your CSV file.
$directory = $this->_moduleReader->getModuleDir('etc', 'Vendor_Modulename');

// This is your CSV file.
$file = $directory . '/your_csv_file_name.csv';

if (file_exists($file)) {
    $data = $this->_fileCsv->getData($file);
    // This skips the first line of your CSV file, since it will probably be a heading.
    // Set $i = 0 to not skip the first line.
    for ($i = 1; $i < count($data); $i++) {
        var_dump($data[$i]); // $data[$i] is an array with your CSV columns as values.
    }
}
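If you want to address the columns by name rather than by numeric index, the heading row can be combined with each data row. A minimal sketch, assuming the first line of the CSV holds the column headings (the sku and qty names below are hypothetical):

// Sketch: turn the numeric rows from getData() into heading-keyed arrays.
// Assumes the first CSV row contains the column headings.
$data = $this->_fileCsv->getData($file);
$headings = array_shift($data);          // e.g. ['sku', 'qty'] (hypothetical)

$rows = [];
foreach ($data as $row) {
    // Pair each heading with the matching column value
    $rows[] = array_combine($headings, $row);
}

foreach ($rows as $row) {
    var_dump($row['sku'], $row['qty']);  // hypothetical column names
}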

Max file upload size

I have the following code to upload a temp file on the server.
I would like to check the file size: if the file is larger than 25MB I want to abort the process, and if it is smaller than 25MB I want to proceed with the upload.
$tmp_name = $_FILES['filename']['tmp_name'];
$type = $_FILES['filename']['type'];
$file_name = $_FILES['filename']['name'];
$size = $_FILES['filename']['size'];
Does anyone have a suggestion?
Thanks for any reply.
You could use something such as:
if ($_FILES['file']['size'] > 26214400) { // 26214400 bytes = 25 * 1024 * 1024 = 25MB
    echo "File too large. Uploaded files can be no more than 25MB.";
} else {
    // ... let them upload ...
}
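A slightly fuller sketch that also checks the upload error code before trusting the reported size; the uploads/ destination directory is an assumption, and the filename field matches the question's form field. Note that upload_max_filesize and post_max_size in php.ini must also allow at least 25MB, otherwise PHP rejects the file before this code ever runs.

// Sketch: validate the upload, then move it. 'uploads/' is a hypothetical
// destination directory; 'filename' is the form field from the question.
$maxBytes = 25 * 1024 * 1024; // 25MB

if (!isset($_FILES['filename']) || $_FILES['filename']['error'] !== UPLOAD_ERR_OK) {
    die('Upload failed or no file was sent.');
}

if ($_FILES['filename']['size'] > $maxBytes) {
    die('File too large. Uploaded files can be no more than 25MB.');
}

$destination = 'uploads/' . basename($_FILES['filename']['name']);
if (move_uploaded_file($_FILES['filename']['tmp_name'], $destination)) {
    echo 'Upload succeeded.';
} else {
    echo 'Could not move the uploaded file.';
}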

Setting SharePoint File Field Attributes

In our SP site, we have a library with files. These are files associated with a user. We have now customized the users' profiles to accept a list of files, and to this list of files in the user's profile we would like to add a reference to the existing file so that the user doesn't have to upload it again.
Current Library:
/personal/my/User Files/[filename]
So I was wondering how to do this. The data looks like this in the new User Files field (JSON):
{
    [
        {
            "Id":"1",
            "Title":"Test",
            "Url":"\/personal\/my\/User+Files\/testfile.doc"
        }
    ]
}
I have a CSV file that I iterate over. The CSV file contains the username:filename pairs.
The Id value has to be retrieved from the SP library at that location for that file.
Powershell code:
$upAttribute = "UserFiles"
$profile_info = Import-Csv profiles.csv

foreach ($row in $profile_info) {
    $userId = $row.user                     # User ID
    $fullAdAccount = $adAccount += $userId

    # Check to see if the user profile exists
    if ($profileManager.UserExists($fullAdAccount))
    {
        $up = $profileManager.GetUserProfile($fullAdAccount)
        $upAttributeValue += $row.filename  # Filename
        # CODE ??????
        $up.Commit()
    }
}
That is all the data that I have.
Thanks for any and all help.
Eric
You will first need to add the custom property to the User Profile like so:
http://www.paulgrimley.com/2011/02/adding-custom-user-profile-property-to.html
Then this should help you out:
http://get-spscripts.com/2010/07/modify-single-value-user-profile.html
#Get user profile and change the value
$up = $profileManager.GetUserProfile($adAccount)
$up[$upAttribute].Value = $upAttributeValue
$up.Commit()