How to use Pig in Visual Studio with Azure HDInsight - apache-pig

I created my HDInsight cluster on Azure and connected to it with Cloud Explorer in VS2017.
I want to make a word counter using HDInsight Pig.
I created my Pig project, but when I start my script.pig I get an error:
Could not copy the file "obj\Debug\Pig" because it was not found.

You can use Data Lake Tools for Visual Studio, which supports creating Pig scripts and submitting them to HDInsight clusters. You can create a Pig project from the template and then submit the script to your HDInsight cluster.
After creating a script, if you click the Start button, Visual Studio tries to build and run the project locally, and you get this error message:
Could not copy the file "obj\Debug\Pig" because it was not found.
Instead, you need to submit the Pig script to the HDInsight cluster you created.
The Pig script is then submitted to the HDInsight cluster successfully:
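If you ever need to submit the same script without Visual Studio, the cluster's standard WebHCat (Templeton) REST endpoint accepts Pig jobs as well. This is only a rough sketch, assuming the script has already been uploaded to the cluster's default storage; the cluster name, login password, and script paths below are placeholders:
curl -u admin:<cluster-password> \
  -d user.name=admin \
  -d file=/scripts/wordcount.pig \
  -d statusdir=/scripts/wordcount-status \
  https://<cluster-name>.azurehdinsight.net/templeton/v1/pig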

Related

Create Azure Repo for Existing SQL Database

We have an existing SQL database on a SQL Server. We use Azure Repos.
I would like to be able to create a repo for the SQL database.
Not sure what the proper steps are.
I can create an empty repo in Azure DevOps and clone it to my local machine.
I can open up Visual Studio, but how do I import the SQL database into a new project?
Once I have the SQL project I can push back up to Azure DevOps with Sync...Push and create a pull request.
There are multiple ways you can import your SQL database project into Azure Repos. Refer to the approaches below:
I created a database object and imported it into a SQL Server project in Visual Studio Code, like below:
There are 3 ways you can import this SQL project from your local machine to Azure DevOps:
Approach 1)
When you create a new project in Azure DevOps, a default repo with the same project name is created in the Azure DevOps organization. You can use the built-in authentication to import your repo into Azure DevOps; refer to below:
I ran these commands in the Visual Studio Code terminal to import the SQL project into the Azure DevOps repo, like below:
git --version
git init
git add .
git commit -m "Initial commit"
Now copy the remote origin commands from the Azure DevOps page (the "Push an existing repository from command line" section shown in the image above) and push to the Azure repo, signing in to the Azure DevOps account when prompted, like below:
git remote add origin https://<org-name>@dev.azure.com/<org-name>/<project-name>/_git/<project-name>
git push -u origin --all
The SQL project is added to the Azure DevOps repo, like below:
Now you can set up a build and run the SQL pipeline, or commit further changes to your project and push them via git push to the same repo, as sketched below.
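For example, after editing the project, a later change can be committed and pushed over the same remote (the commit message here is just a placeholder):
git add .
git commit -m "Update table definitions"
git push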
Approach 2)
Importing via GitHub.
You can save this entire SQL project in your GitHub account and import it into Azure DevOps, like below:
You can paste the GitHub repo URL and import it directly without any authentication for a public repository, or import it with Git authentication using a PAT token, like below:
Click Require authentication > enter the username and password of your GitHub account, and get the PAT from GitHub Account Settings > Developer settings > Personal access tokens > Generate new token.
The GitHub repo got imported into Azure DevOps successfully.
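To get the local project into GitHub in the first place, the push is the same Git flow as in Approach 1, just against a GitHub remote (the account and repository names here are placeholders):
git remote add origin https://github.com/<your-account>/<your-repo>.git
git push -u origin --all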
Approach 3)
Import the project from Visual Studio Code, like below:
You can push the code from Visual Studio Code to the Azure DevOps repo with the above option.
You can start creating a release pipeline, import the SQL project as an artifact, and run the pipeline, like below. The essential build-and-deploy commands such a pipeline runs are sketched next.
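Under the hood, a build-and-deploy pipeline for the SQL project boils down to producing a .dacpac and publishing it. A minimal sketch of those two steps, assuming an SDK-style SQL project (classic SSDT projects build with msbuild instead) and SqlPackage installed on the agent; the project, path, server, and database names are placeholders:
dotnet build <project-name>.sqlproj -c Release
SqlPackage /Action:Publish /SourceFile:bin/Release/<project-name>.dacpac /TargetServerName:<server-name> /TargetDatabaseName:<database-name>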

Unable to migrate Test Suite and Test Plan

I am using the tool version MigrationTools-11.9.31, and with it I am able to migrate all the work items and shared queries from TFS on-premises to Azure DevOps Services. But while migrating the Test Suites and Test Plans I am facing the issue as follows.
Please find the log file.
My config file is below.
Please help me out with the same. Thank you!
-Asit

GUI for Azure DevOps no longer available

When I add a new Build Definition in Azure Pipelines, the GUI is no longer available; the YAML editor appears instead.
My team needs to gradually move over to YAML. I know the GUI works for existing pipelines, but we would like to create new pipelines with the GUI too.
Click on the Use the classic editor link when creating a new pipeline:

How to package and deploy serverless framework code in separate steps?

My question is about separate package/deploy steps. What I want to do is package the service at step 1 of the deployment process, then copy the content of the package to another machine and deploy from there. I can't make it work. I use no parameters, and "serverless package" seems to work fine (it creates the ".serverless" folder without attempting to deploy), but when I copy the ".serverless" folder to another location and execute "serverless deploy", it only says "packaging service" and does nothing. Is this how deployment of a package is supposed to work? This happens on a vanilla AWS Node service.
The command serverless deploy --package path-to-package seems to be what you are looking for, as specified in the Serverless Framework documentation.
This deployment option takes a deployment directory that has already been created with serverless package and deploys it to the cloud provider. This allows you to more easily integrate CI/CD workflows with the Serverless Framework.
You were probably missing the --package option.
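As a minimal sketch of that two-step flow (the artifact directory name is just a placeholder, and the deploy command is run from the service directory on the other machine):
serverless package --package ./deployment-artifacts
# copy ./deployment-artifacts (along with the service's serverless.yml) to the deploy machine, then:
serverless deploy --package ./deployment-artifacts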

DevOps - Selenium - Jira integration into Jenkins

I am working on a DevOps project. From Selenium I am running a test script, and a log file is generated. How do I configure Jira to read the log file generated by Selenium? I want to go with an API approach but am unable to do so. I am using Jenkins as the CI tool here. Any suggestions?
Hmm. Generally it's a better approach to display your test results in Jenkins instead of creating issues for them automatically. You didn't mention what technology you use (Node.js, Java, ...), but typically you let your test runner generate a test results file that Jenkins can interpret, so it will display the results nicely. There are various Jenkins plugins that can help with that.
If you want to go a step further and still create issues automatically, you can script that in a separate step of your Jenkins job, using the Jira REST API and a scripting language of your choice. It just comes down to parsing the results file and letting your script create issues for the failures.
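As a rough sketch of that last step, a single failure could be turned into an issue with one call to the standard Jira REST API; the host, credentials, project key, and summary text below are all placeholders, and in practice the call would be wrapped in whatever script parses your results file:
curl -u <jira-user>:<api-token> \
  -X POST \
  -H "Content-Type: application/json" \
  -d '{"fields": {"project": {"key": "PROJ"}, "summary": "Selenium test failed: <test name>", "description": "See the Selenium log / Jenkins build for details.", "issuetype": {"name": "Bug"}}}' \
  https://<your-jira-host>/rest/api/2/issue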