I have this homework to do, but I just don't know how to attack the problem. I need to make a program in Python that can run queries against an Excel database and an SQL database in the same program, without changing either database. My teacher says I can use JSON, but I don't know how to do it. I don't want the code, I just need the tools to solve the problem. Thank you.
If you do end up using Python, I would recommend openpyxl for reading and writing Excel sheets, and MySQL Connector/Python to connect to your database. I find this combo quick and intuitive to use.
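To give a feel for how the two fit together, here is a minimal sketch (not the assignment's solution): openpyxl reads the sheet, mysql-connector-python runs the queries. The file name, credentials, and the table/column names are all placeholders for whatever the assignment actually uses.

```python
# Hypothetical sketch: look up each ID from column A of an Excel sheet
# in a MySQL table. All names and credentials below are placeholders.
def lookup_sheet_ids(xlsx_path):
    import openpyxl
    import mysql.connector

    wb = openpyxl.load_workbook(xlsx_path, read_only=True)
    sheet = wb.active

    conn = mysql.connector.connect(
        host="localhost", user="user", password="secret", database="school"
    )
    cursor = conn.cursor()

    results = {}
    # min_row=2 skips the header row; column A is assumed to hold the ID.
    for row in sheet.iter_rows(min_row=2, values_only=True):
        cursor.execute("SELECT name FROM students WHERE id = %s", (row[0],))
        results[row[0]] = cursor.fetchone()

    cursor.close()
    conn.close()
    return results
```

Neither database is modified here; the sheet is opened read-only and the SQL side only sees SELECTs.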
I'm making my project for A-level Computing and I've reached a problem that I'm not sure how to solve. I'm making something similar to a student information management system, and I want a way to store a small list of students (maybe 5 to 10, preferably) in my program and reference them from all forms, so that if a piece of information is changed on one of the students, that change is carried through to the next form. My teacher has very little knowledge of programming, so I'm kind of stuck. I have no previous experience with databases or SQL, but if someone is willing to break it down I'll be very grateful; I've got a good understanding of arrays. My deadline is the 10th of May, so ASAP please. Thanks
- kyle
If you prefer simplicity, I would stick to a CSV file or XML. I found a website with an end-to-end tutorial on how to create the XML and add items (students) to it using VB.net, as per your tag. After that you only need to read them back from the file, and you can add new ones the same way:
http://vb.net-informations.com/xml/how-to-xml-in-vb.net.htm
I would stick with XML (or CSV, as preferred), as it is basically a text file, so you can view it and make changes directly if needed.
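The create/append/read pattern in the linked VB.net tutorial looks like this in Python's standard library, if that helps make the idea concrete (the file name and fields here are made up):

```python
import xml.etree.ElementTree as ET

def add_student(root, student_id, name):
    # Append one <student> element with an id and a name attribute.
    student = ET.SubElement(root, "student")
    student.set("id", student_id)
    student.set("name", name)
    return student

root = ET.Element("students")
add_student(root, "001", "Kyle")
add_student(root, "002", "Anna")
ET.ElementTree(root).write("students.xml")

# Any form can re-load the file, so a change made in one place is
# visible everywhere the file is read.
names = [s.get("name") for s in ET.parse("students.xml").getroot()]
print(names)  # -> ['Kyle', 'Anna']
```

Because it's plain text, you can also open students.xml in a text editor and fix a record by hand.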
I am new to Access; I am a C programmer who has also worked with Oracle. Now I am creating an Access database for a small business, with an Access front-end and a SQL Server back-end. My database has about 30 small tables (a few hundred records each) and a rather complicated algorithm.
I don't use VBA code because I don't have time to learn VBA, but I can write complicated SQL statements, so I use a lot of queries and macros.
I am trying to minimize the daily growth of my database. I've thought about splitting the Access DB. It doesn't make sense because my DB is rather small. After compacting its size is about 5 MB. The regular compact procedure is not convenient because my client's employees work from home any time they wish. So I need to create a DB that would bloat as slowly as possible.
I did some research and found some useful info: "the most common causes of db bloat are over-use of temporary tables and over-use of non-querydef SQL" (http://www.access-programmers.co.uk/forums/showthread.php?t=48759). Could somebody please clarify that for me? I have 3 questions about it:
1) I cannot help using temporary tables, I tried re-using the same table names in 2 ways:
a) first clear all records and then run an append query or
b) first run a Macro command "DeleteObject" (to free the space in full) and then re-create the temporary table.
Can somebody please advise which way is better in order to reduce the DB growth?
2) After running a stored query I cannot free the space the way I would in C, or via the VBA statement "query.close" (because I don't use VBA). But I can run the Macro command "Close Query" after each "OpenQuery". Will that help, or just double the length of my Macros?
3) Is it correct that I shouldn't use the Macro command RunSQL for simple SQL statements, and should create stored queries instead, even though that means creating additional stored queries?
Any help would be appreciated!
Ah the joys of going back to lego after being a brickie! :)
1) Access stores everything in a single file. When you delete a record or a table, it persists in the file but with a flag which marks it to be ignored. When you compact an Access db, the executable creates a new file, moves everything unmarked into that, then deletes the old file. You can actually see this happening if you use Windows Explorer to monitor the folder during the process.
You mention you are using SQL Server; is there a reason you are not building the temp tables on the server? That would be a faster, cleaner and all-round more efficient solution, unless we've missed something. Failing that, you will have to make the move from macros; but truthfully, if you can figure out C, then VBA will be like writing a memo!
http://www.access-programmers.co.uk/forums/showthread.php?t=263390
2) Issuing close commands for saved queries in Access has no impact on the file-bloat issue; they just look untidy.
3) Yes, always use saved queries, since this allows Access to compile the SQL in advance and optimise execution.
PS: Did you know you can call SQL Server stored procs from within an Access saved query?
https://accessexperts.com/blog/2011/07/29/sql-server-stored-procedure-guide-for-microsoft-access-part-1/
If at all possible, you should look for ways to dispense with storing data on the Access side entirely, since you already have SQL Server as the back-end, though I suspect you have your reasons for this.
So I have a VBA macro which I put together quite recently, and it does an adequate job, if painfully slowly. However, I have been told to port it to VB.net (for various reasons, a main one being that the software team don't want to be stuck supporting VBA macros if I move on).
A key part of the process in VBA was running a couple of lookups on another sheet in the same workbook.
The table in question is ~10,000 values, and looks something like:
Location | Ref-Code | Type
Aberdare | ABDARE | ST
I can put the columns in any order, but what I need to do is check that a value is found in Ref-Code, and if it is return Location and Type.
So, first sub-question: is SQLite the right tool for doing this? Would something else be more sensible for looking up values in a persistent, rarely-altered 10,000ish row table from VB.net?
If SQLite is the right tool, are there any good tutorials to take me through how to connect to and query an SQLite database in VB.Net? I haven't been able to find one yet.
Thanks in advance.
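For what it's worth, SQLite handles a table this size trivially. Here is a sketch of the lookup using Python's built-in sqlite3 module; the flow with System.Data.SQLite in VB.Net is analogous (connect, parameterised SELECT, read the row). The table and column names mirror the example above:

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # use a file path for a persistent db
conn.execute("CREATE TABLE lookup (location TEXT, ref_code TEXT, type TEXT)")
conn.execute("INSERT INTO lookup VALUES ('Aberdare', 'ABDARE', 'ST')")

def find_by_ref(conn, ref_code):
    # Return (location, type) for the code, or None if it isn't found.
    return conn.execute(
        "SELECT location, type FROM lookup WHERE ref_code = ?", (ref_code,)
    ).fetchone()

print(find_by_ref(conn, "ABDARE"))   # -> ('Aberdare', 'ST')
print(find_by_ref(conn, "MISSING"))  # -> None
```

With an index on ref_code, 10,000 rows is well within the "instant" range, and the .db file can sit next to the application and be rebuilt whenever the table changes.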
Why not just use an XML file to store the lookups? There are loads of easy ways to parse XML files in VB, and this way you don't have to learn how to connect to SQLite at all.
The xml file can also be maintained by someone who doesn't know anything about databases.
I need some sort of starting point for the following task:
Some kind of script should watch a certain folder for incoming Excel files. These Excel files all have a worksheet with the same name. The script should then take a certain range of columns from each Excel file, run down all the rows, and write the column data to a running Microsoft SQL Server table.
Now I don't know what scripting language I should/could use for this task. Maybe Perl or Windows PowerShell?
Hope you can point me to the right direction.
UPDATE: Shouldn't I also look into SSIS, since it seems to offer quite a lot of features?
Thank you.
You can create a Windows Service that can monitor a certain folder for a certain interval (say 10 minutes) using .Net.
Using ADO.Net, you can connect to both Excel workbooks and SQL Server and perform SQL-style data transformations. If the Excel doc isn't conducive to SQL queries, there's always MS Office Interop to interface with Excel and select specific cell values (this tends to be more difficult than the former).
I would probably write a script in Perl or Python that tries to pick up the file in the folder and, if successful, parses the data into either dictionaries/hashes (it would be really easy to break rows and columns into a hash) or arrays, making it easy to write to a database.
Then (my knowledge is better in Python, sorry) use the pyodbc module, or whatever other module is necessary, to connect to the server and start writing. I am sorry if this is not as helpful; I am a beginner.
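A rough Python sketch of that polling idea, assuming openpyxl and pyodbc are installed; the folder, worksheet name, DSN, column range, and target table are all placeholders:

```python
import os
import time

def new_excel_files(folder, already_seen):
    # Workbooks in the folder that we have not processed yet.
    return sorted(f for f in os.listdir(folder)
                  if f.lower().endswith(".xlsx") and f not in already_seen)

def load_rows(path):
    import openpyxl
    wb = openpyxl.load_workbook(path, read_only=True)
    sheet = wb["Data"]  # the worksheet name shared by all the files
    # Columns B..D of every row, as tuples.
    return list(sheet.iter_rows(min_col=2, max_col=4, values_only=True))

def write_rows(rows):
    import pyodbc
    conn = pyodbc.connect("DSN=MyServer;DATABASE=mydb;Trusted_Connection=yes")
    cur = conn.cursor()
    cur.executemany("INSERT INTO incoming (a, b, c) VALUES (?, ?, ?)", rows)
    conn.commit()
    conn.close()

def watch(folder, interval=600):
    seen = set()
    while True:  # poll the folder every `interval` seconds
        for name in new_excel_files(folder, seen):
            write_rows(load_rows(os.path.join(folder, name)))
            seen.add(name)
        time.sleep(interval)
```

A production version would also handle half-copied files (e.g. try to open and retry later) and move processed workbooks out of the folder instead of remembering them in memory.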
We have a large database with hundreds of tables that have lookup text stored in them with spelling errors from when the application was originally written offshore. Is there a way to run a spellcheck on the data in sql server so we can find all of these errors quickly?
One thought - You could write a CLR function to access the spell checker included with Microsoft Word. See: Walkthrough: Accessing the Spelling Checker in Word for a starting point.
1) Export your data to Excel.
2) Distribute the sheets to multiple people to break up the workload.
3) Have them run spell check to identify the misspelled words.
4) Gather up the bad records and create update statements for the db.
Don't offshore applications where English spelling is important.
EDIT: I should have pointed out that you might use the spell check to identify the issues, but you want human eyes on the actual data as well as the suggested fixes. Thus, being able to spread the work around to run the spell check is important. There won't be any fully automated solution that will catch everything, or worse, it will catch too much and muck the data up worse.
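The export-and-split part of the workflow lends itself to a small script. A rough Python sketch, assuming pyodbc and openpyxl are available; the connection string, table, and column names are placeholders:

```python
def split_for_reviewers(rows, n):
    # Deal the records into n roughly equal batches, one per reviewer.
    return [rows[i::n] for i in range(n)]

def export_spellcheck_batches(conn_str, query, reviewers=4):
    import pyodbc
    import openpyxl

    conn = pyodbc.connect(conn_str)
    rows = conn.cursor().execute(query).fetchall()
    conn.close()

    for i, batch in enumerate(split_for_reviewers(rows, reviewers), start=1):
        wb = openpyxl.Workbook()
        ws = wb.active
        ws.append(["id", "lookup_text"])
        for row in batch:
            ws.append(list(row))
        wb.save(f"spellcheck_batch_{i}.xlsx")  # one workbook per person
```

Keeping the id column in each sheet is what makes step 4 possible: the corrected text can be matched back to its row when the UPDATE statements are generated.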
Just stumbled across this one.
Another way to do this is with Microsoft Access: use ODBC to link to the tables on the SQL side, then open the table in Access and run the spell check there. You can update the corrections on the fly.