When I try to add external data in Excel, the Data Connection Wizard does not load properly for some reason.
I select Data > From Other Sources > From Data Connection Wizard > My Data Source > the table I want...
Then I have no options to set parameters; I can only click "Finish" without any query set up.
It just defaults to "SELECT * FROM XXXXX".
Anyone have any ideas as to why this would be the case?
I have done this exact same process before, on multiple occasions, with no issue.
Something has changed to make this process not work properly.
There are big changes going on in the Excel data import environment recently, and all of the processes are being remodeled and reworked. I checked my Excel 2016 and it doesn't even list the option you mention (Data Connection Wizard) anymore. I feel a strong push towards the Data Model, which may not suit me or others, but all the other connection methods are gradually ceasing to work.
My guess is that you don't have the latest Excel version, and that's simply why this happens.
Started BI Publisher about a week ago.
When working on a new data model, about one or two queries in, I get this error when I try to save:
Failed to load servlet/res?s=%252F~developer1%252Ftest%252FJustin%2520Tests%252FOSRP%2520Information.xdm&desc=&_sTkn=9ba70c01152efbcb413.
I can no longer save my data model.
I tried deleting my queries, logging in and out, turning machine off and on, but no luck.
I've currently resorted to saving all of my queries locally in Notepad.
I can create a whole new data model and it will save fine, but then after two or three queries the same thing happens.
What's going on and why would anyone design such a confusing error message?
Any help would be greatly appreciated.
After restarting your server, you won't get this issue. It sometimes happens due to a connection problem, so a restart should work. It resolved my problem.
None of the proposed solutions worked for me. I found out, on my own, that any unnecessary brackets around a CASE expression in a SELECT statement will cause this error. Remove the unnecessary brackets and the error goes away.
Oracle MetaLink Doc ID 2173333.1. In BI Publisher releases 11.1.1.8.x and up, there is an option to Manage Cache in the Administration section of BIP. This option was also added to 11.1.1.7 in patch 140715 (11.1.1.7.140715).
Clearing the object cache will resolve the saving errors:
Click on the Administration link
Manage BI Publisher
Manage Cache
Click 'Clear Object Cache'
I have a series of about 30 Excel reports (.xlsm), which each have a unique connection configured to a database. The connection is just a short SQL script which grabs the data to be shown on the report. The data is populated into a table (not a pivot table).
Every week we need to update these reports, so I use a simple PowerShell script to open each of the files and refresh the connection.
Every so often we need to send the base 30 reports to other work groups so they can manually update the files on their own. This can be a nuisance because some of the reports are very large (30 MB+). This makes emailing difficult, and uploading/downloading them several times a day is just a hassle.
To mitigate this, before we distribute the report templates I try to delete all the rows in the tables and any unused range. This has helped, but there are still several files that are VERY large (30 MB+) even though we've deleted everything in the workbook except the connection and the empty table.
Through tests, I've realized that if I delete the configured connection, the file size becomes sufficiently small (<1 MB), which is what I would expect. This leads me to believe that Excel connections have a sort of cache that needs to be cleared, but I can't find any references for this.
Does anyone know a simple way for reducing the size of a connection in such a way that I could do so programmatically using VBA/Powershell?
If deleting the configured connection reduces your file size enough, you could write a macro to delete your connections and another to reestablish them. As Noldor130884 suggested, you can automatically execute the macros on Workbook_Open and Workbook_Close.
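A rough sketch of what that macro pair might look like, assuming a single OLE DB connection; the connection name, connection string, and SQL below are placeholders, not taken from the question:
Sub RemoveReportConnection()
    ' Drop the connection (and its cached definition) before distributing the file.
    On Error Resume Next        ' ignore the error if the connection is already gone
    ThisWorkbook.Connections("ReportConnection").Delete
    On Error GoTo 0
End Sub

Sub RestoreReportConnection()
    ' Re-create the connection; the connection string and query are placeholders.
    ThisWorkbook.Connections.Add _
        "ReportConnection", "Weekly report data", _
        "OLEDB;Provider=SQLOLEDB;Data Source=MyServer;Initial Catalog=MyDb;Integrated Security=SSPI", _
        "SELECT * FROM dbo.ReportData", xlCmdSql
End Sub
Note that re-adding the connection this way only restores the connection itself; if your table is driven by a QueryTable tied to the old connection, you may still need to rebind or recreate it, so treat this as a starting point.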
Office Online - Create, Edit & Manage connections to external data
The above reference seems to make the relevant statement below:
"Removing a connection only removes the connection and does not remove any object or data from the workbook."
It looks to me as if the problem is with the formatting. I don't know why, but in my files Excel reformatted all rows and columns when it added the data from the connection. The sheet was therefore very large, but if you check the XML file it shows only formatting data. Once I manually deleted all the "empty" rows, the file size was normal again. Hope that helps; it helped in my case.
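If you would rather script that cleanup than do it by hand, something along these lines might work (the sheet and table names are placeholders):
Sub TrimUnusedRows()
    ' Delete the formatted-but-empty rows below the table, then save so
    ' Excel recalculates the used range. Names are placeholders.
    Dim ws As Worksheet
    Dim lastRow As Long
    Set ws = ThisWorkbook.Worksheets("Report")
    With ws.ListObjects("tblReportData").Range
        lastRow = .Row + .Rows.Count - 1
    End With
    ws.Rows(lastRow + 1 & ":" & ws.Rows.Count).Delete
    ThisWorkbook.Save
End Sub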
I'm trying to load data from my database into an Excel file of a standard template. The package is ready and it's running, throwing a couple of validation warnings stating that truncation may occur because my template has fields of a slightly smaller size than the DB columns I've matched them to.
However, no data is getting populated to my Excel sheet.
No errors are reported, and when I click preview for my OLE DB source, it's showing me rows of results. None of these are getting populated into my Excel sheet, though.
You should first make sure that you have data coming through the pipeline. Double-click the arrow connecting your Source task to your Destination task (I'm assuming you don't have any steps in between) to open the Data Flow Path Editor. Click on Data Viewer, then Add, and click OK. That will allow you to see what is moving through the pipeline.
Something to consider with Excel is that it prefers Unicode data types to non-Unicode ones. Chances are you have a database collation that is non-Unicode, so you might have to convert the values in a Data Conversion task.
ALSO, you may need to force the package to execute in the 32-bit runtime. The VS application develops in a 32-bit environment, so the drivers you have visibility to are 32-bit. If there is no 64-bit equivalent, it will break when you try to run the package. Right-click on your project, click Properties, and under the Debug menu change the setting Run64BitRuntime to FALSE.
You don't provide much information. Add a Data Viewer between your source and your Excel destination to see if data is passing through. To do it, just double-click the data flow path, select Data Viewer, and then add a grid.
Run your app. If you see data, provide more details so we can help you.
Couple of questions that may lead to an answer:
Have you checked that data is actually passed through the SSIS package at run time?
Have you double checked your mapping?
Try converting within the package so you don't have the truncation issue
If you add some more details about what you're running, I may be able do give a better answer.
EDIT: Considering what you wrote in your comment, I'd definitely try the third option. Let us know if this doesn't solve the problem.
Just as an assist for anyone else running into this - I had a similar issue and beat my head against the wall for a long time before I found out what was going on. My export WAS writing data to the file, but because I was using a template file as the destination, and that template file had previous data that had been deleted, the process was appending the data BELOW the previously used rows. So, I was writing out three lines of data, for example, but the data did not start until row 344!!!
The solution was to select the entire spreadsheet in my template file, and delete every bit of it so that I had a completely clean sheet to begin with. I then added my header lines to the clean sheet and saved it. Then I ran the data flow task and...ta-daa!!! Perfect export!
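If you ever need to script that template cleanup, a small sketch like the following could do it (the sheet and header names are made up):
Sub ResetTemplateSheet()
    ' Wipe the template sheet so the stale used range disappears,
    ' then put the header row back. Names are placeholders.
    With ThisWorkbook.Worksheets("ExportSheet")
        .Cells.Delete
        .Range("A1:C1").Value = Array("Col1", "Col2", "Col3")
    End With
    ThisWorkbook.Save
End Sub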
Hopefully this will help some poor soul who runs into this same issue in the future!
I'm building a database in Access 2002 and I've had some problems with shared-mode locking. I have an app with a lot of programs and almost 10 users logged on. One of the things the app does is open a report: the user chooses the program and the number (there are usually 4 numbers for each program), and before the report opens, I open it in hidden mode and edit it for the current program/number. With this routine I have just one report that can stand in for hundreds, saving memory and keeping things optimized. Then I save it and open it again in view mode.
But the problem is when another person is using the app: Access can't edit and save the report unless it's opened in exclusive mode!
Is there a command in VBA that momentarily allows design changes in shared mode? Something like freezing all users, saving, and then unfreezing them?
Or any other suggestion?
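To give an idea, the routine looks roughly like this (the report, table, and field names here are placeholders, not my real ones), and it is the save of the design change that wants exclusive access:
Sub ShowProgramReport(ProgramId As Long, ReportNumber As Long)
    ' Open the report hidden in design view, repoint its record source,
    ' save, then reopen it for viewing. Names are placeholders.
    DoCmd.OpenReport "rptProgram", acViewDesign, , , acHidden
    Reports("rptProgram").RecordSource = _
        "SELECT * FROM tblResults " & _
        "WHERE program_id = " & ProgramId & " AND report_number = " & ReportNumber
    DoCmd.Close acReport, "rptProgram", acSaveYes    ' this is the save that fails in shared mode
    DoCmd.OpenReport "rptProgram", acViewPreview
End Sub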
As for creating a front-end/back-end split, I think that is impractical because this is a beta version and I have to update it often. I already tried it, but it became too slow: I split it into a back-end database (just the tables) on the network and a front-end with all the queries, forms, reports, and linked tables on the local PC, but it really became too slow. If someone can help me make it faster, that would solve my problems too.
I split the db and I'm trying to optimize it. I read a lot about it on the web and changed every Access setting I saw that needed changing, and now I have a faster program, though still slower than the single-file app.
But now there are just a few forms making my app slow.
For example, I have some forms that always take a long time to close. ;(
I realised that these forms are always saving before they close, and whenever a form has to save (with linked tables that are on the network) it wastes a lot of time, so I need to avoid this.
But I haven't managed it so far...
I realised these forms are saving because in Form_Open I hide some columns (which are different for each program) and edit their captions. Then when I close the form, it saves and wastes all that time!
How can I hide/edit these columns without needing to save the form? Or how can I close the form without saving design changes?
I know how to do that with a button, but these forms are datasheets and I can only close them with the form's "X" button. Unfortunately, Access doesn't have a BeforeClose event, and with the OnClose event it saves before reaching that sub!
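To make it concrete, my Form_Open does roughly this (the control names are just examples), and it is these runtime changes that flag the form's design as modified:
Private Sub Form_Open(Cancel As Integer)
    ' Hiding datasheet columns at open time marks the form as changed,
    ' so Access saves the design over the network when the form closes.
    Me.Controls("txtBudget").ColumnHidden = True
    Me.Controls("txtForecast").ColumnHidden = True
End Sub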
See if you can make use of a WhereCondition with the DoCmd.OpenReport Method to avoid the need to modify your report's design at run time.
The WhereCondition is applied to your report's existing record source query as if it were written into that query's WHERE clause.
So if the record source for YourReport is ...
SELECT program_id, some_number, another_field
FROM YourTable;
... then this ...
DoCmd.OpenReport "YourReport", _
WhereCondition:="program_id = 7 AND some_number = 22"
would give you the same set of rows as would revising the record source to this ...
SELECT program_id, some_number, another_field
FROM YourTable
WHERE
program_id = 7
AND some_number = 22;
The advantage of this approach, if you can make it work for your situation, is that you would no longer need exclusive access to the db, since you're not actually changing the report's design.
The related issue about beta status making it impractical to split the application is something you should re-consider carefully. Splitting ensures you can easily preserve the data in the BE when you roll out changes to the FE application. Even if you've come up with another method to avoid losing data when you change versions, that method can not be simpler than segregating the data into a BE file.
And when you split the application, each user should get their own copy of the FE file which is stored locally on their machine's hard drive; those FE files will contain links to the tables in the BE file which is stored on a file share.
Keeping the users' FE applications updated as you release new versions is a problem which has been solved. For example, see Tony Toews' Auto FE Updater. And you can find other approaches by searching the web.
If your concern is performance with a split application, check Tony's Microsoft Access Performance FAQ.
Just a little background: I am using Access 2010 to create forms and VBA code in an Access 2003 format database. For some reason, Access 2007 format databases always corrupt on me when I make changes and save them with a particular group of objects, but that's for another discussion.
When writing VBA code in this Access 2003 database, any time my code breaks (via breakpoint or an unhandled error) and I make a correction, Access tells me that it can't save back to the database because another user has it open. However, I am the only user working on the database; this is a local copy of the database and it's sitting on my desktop.
The LDB file can't be deleted because Access is using it. When I first load the database, I see my machine name and "Admin" when opening the LDB in a text or hex editor. After a break, I see that plus a duplicate entry, but this time around "admin" has a lower-case "A."
Closing the database and reopening it fixes the problem but makes it needlessly cumbersome to debug my code. Anyone else encounter this issue and/or have a fix for it?
It might be helpful to know what your code is doing when this happens. Certainly that's not normal behavior. For instance, are you opening another database with New Access.Application? Are you using ADO or DAO to access records in the database with a connection string?
There are no external connections to the database at all.
It may not matter whether there are external connections to the database if you are using a connection string to connect to the already-open database; I'm not sure, but that itself may be seen as an external connection. You may want to use CurrentDb for DAO, or CurrentProject.Connection as your ActiveConnection for any ADO queries.
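A short illustration of that suggestion, with a placeholder table name (DAO needs a reference to the DAO/ACE object library, ADO to the Microsoft ActiveX Data Objects library):
    ' Use the handles Access already holds instead of building a new
    ' connection string to your own front end.
    Dim rsDao As DAO.Recordset
    Set rsDao = CurrentDb.OpenRecordset("SELECT * FROM tblExample")

    Dim rsAdo As New ADODB.Recordset
    rsAdo.Open "SELECT * FROM tblExample", CurrentProject.Connection

    rsAdo.Close
    rsDao.Close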
I am assuming that this problem persists through reboots; but for the sake of argument, try closing out Access and going to the task manager to make sure you have no other instances of MSAccess.exe running. You might even try closing all Office products and/or making sure that Access is the only Office product running. I have seen some weird conflicts between Microsoft Communicator and Outlook; so it's not entirely out of the question for Access to have issues with another MS product.
You may also want to check the size of the database to make sure it's not exceeded 2GB. That causes the infamous "Invalid parameter" error; perhaps it might be causing this as well.
With no other details about how your program works, we may only be able to offer generic advice like this.
I have discovered a way to cause the problem discussed above (and thereby to correct it). Turns out if you create a database object and set it to the current database, you get this problem.
That is,
Dim cdb As Database
Set cdb = CurrentDb
From this point on, you're cooked.
Instead, figure a way around this by possibly using currentdb directly or not using it at all.
This worked for me.
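For instance, instead of holding a Database variable, you can call CurrentDb inline as suggested; the table and field names here are placeholders:
    ' Call CurrentDb directly rather than keeping a Database object around.
    CurrentDb.Execute "UPDATE tblExample SET Processed = True", dbFailOnError

    Dim rs As DAO.Recordset
    Set rs = CurrentDb.OpenRecordset("SELECT * FROM tblExample")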
In your VBA, try checking that all your open connections to the database are closed. As long as a connection is open, the LDB file will be there.
Same symptom of not being able to save form or code mods after application had started. I found a workaround today! In the startup of my first form of the app, I had issued a "DAO.DBEngine.SetOption dbMaxLocksPerFile, 20000". Commenting this statement removed the problem. I did no further testing, but FYI, the DBEngine call was before any reference or attempt to use CurrentDB(). Also the current default on my Access 16 install is 9,500.
I thought I might answer here, since I stumbled upon this question while having a similar issue. Essentially, it boiled down to this: I could either edit forms, VBA, etc. or edit information in the local database (which I'm using as a cache) with currentDB. I also have a backend database, but the locking was clearly on the frontend database.
The solution ended up being weird, but stupidly simple. When the frontend starts up, I have it immediately create a connection to the backend using OpenRecordset (and similarly to you, that backend was still on my own computer for testing purposes). I tried temporarily disabling that code, and suddenly it wasn't an issue anymore. And it turns out, once I call currentDB, I can then call OpenRecordset to open the connection to the backend, and suddenly it isn't a problem anymore.
Tl;Dr: if you're calling OpenRecordset somewhere in your code to connect to a backend, be sure to call something like set db = currentDB beforehand, then everything works. (That is, probably until I publish this answer and Access then decides it doesn't want to anymore).
Why this fixed it is beyond me, someone with more knowledge can maybe answer that.
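A minimal sketch of that ordering, assuming the back end is reached through a linked table (the names are placeholders):
    ' Touch CurrentDb first, then open the back-end recordset.
    Dim db As DAO.Database
    Set db = CurrentDb

    Dim rs As DAO.Recordset
    Set rs = db.OpenRecordset("SELECT * FROM tblLinkedBackEnd")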
The solution:
Options > Current Database > enable 'Track name AutoCorrect info'