I have a very simple Python program for collecting a number from a website every hour, and it works great. I would, however, like to create a .csv file that stores all of these numbers, adding a new row every time one is collected, but I am unsure how to do this. Can anyone help me out?
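A minimal sketch of one way to do this (the file name is a placeholder, and append_to_csv would be called from your existing hourly loop): open the file in append mode and write one row per collection.

    import csv
    from datetime import datetime

    def append_to_csv(value, path="numbers.csv"):
        # "a" mode creates the file if it doesn't exist and appends a row otherwise
        with open(path, "a", newline="") as f:
            csv.writer(f).writerow([datetime.now().isoformat(), value])

    # inside the hourly loop, after collecting the number:
    # append_to_csv(collected_number)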
I am building an Access 2010 database which will store and query information relating to time spent by users in our team. Part of the reporting needs to include whether timesheets have been submitted on time.
The process is currently being managed in Excel but is becoming cumbersome due to the growing size of the consolidated data. In the current process, the flag indicating whether someone is late with their timesheet is applied manually.
Instead of manually adding a Yes/No value to the Excel data, I wondered whether it is possible to set up separate TransferSpreadsheet processes in Access to upload the Excel data (and attach them to separate command buttons) so that, depending on which one is executed, the import process adds a Yes or a No value to the last column of the data as it is uploaded.
That way we can import the Excel data for those who submitted their timesheets on time (and 'stamp' them Yes for being on time), and any late-submitted timesheet data can be imported afterwards (and 'stamped' with a No).
I have spent several hours looking at online forums and instruction pages but cannot find anything close to what I am trying to achieve, hence the reason for posting this here.
This is just one of the options I am considering but my VBA skills are insufficient to establish whether such a process could be handled in VBA. All help appreciated. Thanks.
Solved this one myself with a bit of perseverance. I ended up running a few DoCmd.RunSQL commands to Alter / Delete / Insert on the tables I had, used a staging ('join') table to load the data from Excel, and then ran a command to append the data from the staging table to the main table. I just invoke slightly different commands to update the table field depending on whether the data was submitted late or on time.
I am using the link below to access data from a Google Sheet:
https://sheets.googleapis.com/v4/spreadsheets/{spreadsheetId}/values/{range}?key={API_Key}
Now I don't want to fetch the data every time unless it has been modified. I will make the API call each time, and if the data has been modified I only want that specific data rather than the complete sheet. If getting only the modified data is not feasible, then the complete data will also work for me.
Any help is appreciated!
You'll still need to get the list of revisions before you can get the latest one. It's discussed more in the Manage Revisions section of the Drive documentation. Once you've called revisions.list, just pick the revision you want from the list (with a conditional loop).
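For example, here is a rough Python sketch of checking the latest revision through the Drive v3 revisions endpoint before re-fetching the sheet (the file ID and OAuth access token are placeholders, and a Drive scope on the token is assumed):

    import requests

    FILE_ID = "your-spreadsheet-file-id"      # placeholder: the sheet's Drive file ID
    ACCESS_TOKEN = "your-oauth-access-token"  # placeholder: token with a Drive scope

    resp = requests.get(
        f"https://www.googleapis.com/drive/v3/files/{FILE_ID}/revisions",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        params={"fields": "revisions(id,modifiedTime)"},
    )
    resp.raise_for_status()
    revisions = resp.json().get("revisions", [])

    # Compare the newest modifiedTime with the one you stored last time;
    # only call the Sheets values endpoint again when it has changed.
    latest = max(revisions, key=lambda r: r["modifiedTime"]) if revisions else None
    print(latest)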
This is an open-ended question. I have only a beginner's understanding of databases, but I am willing to learn whatever is required. That said, I believe my problem could be solved without learning a lot.
So, here goes the question:
I have a large number of files being generated in my projects (depending on the builds), and I need to archive them and also reproduce them by buildNumber if requested by users. I don't expect many such requests, maybe 1-2 a day.
For example: 16 GB of data per build, every week. Most of the files in the weekly builds are duplicates, and I don't want to archive them again and again; I'd prefer to store them only once. One caveat: a file's relative location can change even though its content hasn't.
My approach is as follows: compute a hash of each file, store the file under a fileHash -> actual-file key-value pair, and record this information in some kind of manifest file for each build. That way I should be able to recreate the builds with the correct files/paths, etc. (a sketch of this idea follows below).
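A minimal sketch of that idea, assuming a single flat content-addressed store folder and one JSON manifest per build (the folder name, the manifest naming, and the choice of SHA-256 are illustrative assumptions):

    import hashlib
    import json
    import os
    import shutil

    STORE_DIR = "store"  # one flat folder; files are named by their content hash

    def file_hash(path, chunk_size=1 << 20):
        # SHA-256 of a file, read in chunks so large files don't exhaust memory
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(chunk_size), b""):
                h.update(chunk)
        return h.hexdigest()

    def archive_build(build_dir, build_number):
        # Copy each unique file into the store once and write a manifest
        # mapping relative path -> hash so the build can be rebuilt later.
        manifest = {}
        os.makedirs(STORE_DIR, exist_ok=True)
        for root, _, files in os.walk(build_dir):
            for name in files:
                src = os.path.join(root, name)
                rel = os.path.relpath(src, build_dir)
                digest = file_hash(src)
                dst = os.path.join(STORE_DIR, digest)
                if not os.path.exists(dst):  # duplicate content is stored only once
                    shutil.copy2(src, dst)
                manifest[rel] = digest
        with open(f"manifest_{build_number}.json", "w") as f:
            json.dump(manifest, f, indent=2)

    def restore_build(build_number, target_dir):
        # Recreate the build's directory layout from its manifest and the store.
        with open(f"manifest_{build_number}.json") as f:
            manifest = json.load(f)
        for rel, digest in manifest.items():
            dst = os.path.join(target_dir, rel)
            os.makedirs(os.path.dirname(dst) or ".", exist_ok=True)
            shutil.copy2(os.path.join(STORE_DIR, digest), dst)

Because the store is keyed by content hash, a file whose relative location changes between builds is still stored only once; only its manifest entry differs.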
Can it ever happen that two different files have the same hash? Can some database help do this efficiently? I am currently thinking of dumping all the files in one folder.
Thanks
My situation:
At a competition, we will have 6 "scorers", each using a separate Android tablet. For every game (there will probably be 70 or 80 throughout the tournament), each person will score accordingly in a custom app that creates a .csv file. (To be clear, each match will result in 6 separate, one-row .csv files.) The format of the data will be the same from game to game and from scorer to scorer. I can control the names of these files, such as "[Scorer#]_[Match###].csv". These tablets will all be connected to a central computer via USB.
What I would like to do:
I would like the data from all of those files to automatically populate a "database" table on a single sheet. If possible, I would like a folder to act as a "watch folder" of sorts, where, as a new file shows up in the folder, its data is automatically ingested into the table. If that is not possible, I would be happy with a single function I could run to check for new data after each game ends.
I had considered trying Power Query, but wasn't sure whether that could lead me to a usable solution.
Any suggestions would be greatly appreciated!
(and I apologize if anything is unclear. I'm happy to clear up any confusion)
Power Query is a good fit for that scenario. You can set up a query that loads all the files in a specific folder and appends their contents. Refresh the query when new files have been added to the folder.
For detailed instructions on how to set up such a query, take a look here:
http://excelunplugged.com/2015/02/10/get-data-from-folder-in-power-query/
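If it helps to see the same folder-append idea outside Power Query, here is a rough Python sketch that merges every .csv in a folder into one combined file (the folder name, file pattern, and output name are assumptions based on the question):

    import csv
    import glob
    import os

    WATCH_FOLDER = "scores"             # folder the tablets' files land in
    OUTPUT_FILE = "combined_scores.csv"

    rows = []
    for path in sorted(glob.glob(os.path.join(WATCH_FOLDER, "*.csv"))):
        with open(path, newline="") as f:
            for row in csv.reader(f):
                # keep the source file name so each one-row file stays traceable
                rows.append([os.path.basename(path)] + row)

    with open(OUTPUT_FILE, "w", newline="") as f:
        csv.writer(f).writerows(rows)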
I work with VBA in MS Access databases. I'd like to be able to log when files are saved, modified or deleted without having to update the existing code to do the logging when the pertinent events take place. I want the time, location and the name of the file.
I found a good example here: when file modified
However, it only allows for monitoring a particular location (path). I want to be able to log regardless of where the save, modify or delete takes place. I'm only allowed to program in the MS Office environment in this situation. It seems as though using the Windows API is going to be how this task will be achieved. However, I don't have much experience working with the API. Is there an easier way to achieve what I want that doesn't involve using the API?
Have you worked with After_Update or After_Insert data macros? Also, is your application split, meaning there is a front-end and a back-end of the database? You can create a separate table that mirrors the table you need to track changes for. Every time that table is updated, run a macro that inserts a row into the mirror table.
I'm assuming you're saving files to the database. If that's the case, add an After_Update or After_Insert macro that keeps track of when the files are modified or added to the table.