I have a unique issue to deal with. I have multiple drill-throughs running from a source report to target reports: Report A has multiple drill-throughs from each intersection to different target reports. It was a little tricky, but I got it working.
Now I have to send an item value or parameter from the source to the SAME EXACT target report, but with different filter values, so the target report can show different information.
So: Source A ---> Target A with a parameter or value of "Jaguar", and the target returns only "Jaguar" information. Source A ---> Target A with a parameter or value of "BMW", and the target returns only "BMW" information.
Since there is no parameter on the prompt page for the car type, I somehow have to set up a value such as "Jaguar" and send it as a value to the target report, and the same for "BMW".
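Conceptually, what I think I need is a data item in the source report's query that just holds the literal value (a rough sketch only; the data item name is made up):

Data item name: CarType
Expression definition: 'Jaguar'

...and then some way to map CarType to the target report in the drill-through definition.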
So, does anybody know how I can set up a value in the source report and send it as part of the drill-through definition to the target report?
Thx so much for your time.
I am creating a printout for a BOM using an Advanced PDF template. I am trying to get the value of the Start Date from the work order, but in the printout it is empty. I set this using ${record.startdate}. Is this correct? Is there another internal ID for the start date and end date on the work order?
Thanks!
If the object passed to the PDF generator is named "record", then of course you can reference it that way; it depends on the name of the passed parameter. If you are modifying an existing template, check the rest of the script and you will see whether it's 'record' or not.
You can reference them with ${record.startDate} and ${record.endDate}.
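For illustration, a minimal fragment of an Advanced PDF/HTML template (a sketch only; it assumes the work order is passed to the template as "record", and the surrounding markup is made up):

<!-- hypothetical body fragment of an Advanced PDF/HTML template -->
<table>
  <tr>
    <td>Start Date: ${record.startDate}</td>
    <td>End Date: ${record.endDate}</td>
  </tr>
</table>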
All the available fields for the object can be found here.
I am trying to show images for products inside a basic report. The image needs to be dynamic, meaning the image should change based on the SKU value.
Right now I am inserting an image into a table, setting its source to External, and I've tried:
=Fields!URL.Value
=http://externalwebservername/sku= & Fields!SKU.Value
="http://externalwebservername/sku=" & Fields!SKU.Value
I do not get any images in my table.
My stored proc has all the data, including a URL for the image I want to show. Here is a sample of what the URL looks like:
http://externalwebservername/sku=123456
If I enter the URL directly in the field, without the "=" expression, it will show that ONE image only.
How should I set up the expression to properly show the external image based on a dynamic URL? I'm running SQL Server 2016.
Alan's answer should work, but in our environment we have strict proxy/firewall rules, so the two servers could not contact each other. Instead, we are navigating to the file stored on our storage system.
We altered the URL column in the stored procedure to point to a file path. Insert the image, set Source to External, and set Value to [URL].
URL= file://server\imagepath.jpg
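The relevant part of the stored procedure ended up looking roughly like this (a sketch; the table and column names are made up for illustration):

-- hypothetical: build a file-path URL per product instead of an http URL
SELECT p.SKU,
       'file://server\imagepath\' + p.SKU + '.jpg' AS URL
FROM dbo.Products AS p;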
As long as the account executing the report has permission to access the URLs, your third expression should have worked.
I put together a simple example as follows.
I created a new blank report then added a Data Source. It doesn't matter where this points, we won't use it directly.
Then I created a dataset (DataSet1) with the following SQL to give me a list of image names.
SELECT '350x120' AS suffix
UNION SELECT '200x100'
UNION SELECT '500x500'
Actually, these are just parameters for the website http://placehold.it/ which will generate images based on the size you request, but that's not relevant for this exercise.
We'll be showing three images from the following URLs
http://placehold.it/350x120
http://placehold.it/200x100
http://placehold.it/500x500
Next, create a table; I used three columns to give me more testing options. Set the DataSetName property to DataSet1 if it isn't already.
In the first column the expression is just =Fields!suffix.Value
In the second column I added an image, set its Source property to External and the Value to ="http://placehold.it/" & Fields!suffix.Value
I then added a third column with the same expression as the image Value so I could see what was being used as the image URL. I also added an action that goes to the same URL, just to check that the URL did not contain any unprintable characters that might cause a problem.
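To recap, the three column values used above are plain SSRS expressions:

Column 1 (text): =Fields!suffix.Value
Column 2 (image Value, Source set to External): ="http://placehold.it/" & Fields!suffix.Value
Column 3 (text, same expression as the image Value): ="http://placehold.it/" & Fields!suffix.Value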
(Screenshots showing the basic report design and the rendered result followed here.)
When I do an extract from multiple files and include part of the filename in the fields list and in the FROM clause (e.g. FROM "/input/filename-{filedate:*}.nc"), the resulting output file contains only a header row. If I remove "filedate" from the fields list and the FROM clause, I get the correct output.
I noticed in the job graph that when "filedate" is included, an "Empty Input" and an "Extract Cross" step are added before the "PodAggregate" step, and in the "Extract Cross" step no data is written. What is this step?
Also, if I run the original extract including "filedate" locally, I get the correct output, so this error occurs only in ADLA.
I use a custom extractor and I don't know if this has anything to do with it. I haven't tested with a built-in extractor.
We released the new "fast file set" option and enabled it by default. Unfortunately, it introduced a regression for some plans. Until we fix it, please add the following statement to your script:
SET ##InternalDebug = "FileSetV2:off";
Our apologies for any inconvenience this may have caused.
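To be clear, the statement goes at the very top of the script, before the EXTRACT that uses the file set pattern. A sketch (the schema and extractor name are placeholders based on the question):

SET ##InternalDebug = "FileSetV2:off";

@data =
    EXTRACT value string,
            filedate string  // virtual column from the {filedate:*} file set pattern
    FROM "/input/filename-{filedate:*}.nc"
    USING new MyNamespace.MyCustomExtractor();  // the question's custom extractor

OUTPUT @data
TO "/output/result.csv"
USING Outputters.Csv();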
FYI to start: I am aware of how to properly set up an update to a lookup, and am 99% positive I've done this correctly.
I know this because when I set the workflow to start automatically when an item is changed, it works perfectly. But when I simply change this setting so it starts automatically on new item creation, the workflow is cancelled and I get "Coercion Failed: Unable to transform the input lookup data into the requested type." If both options are checked, it fails on creation, but simply clicking Edit on the item properties and then Save makes it work.
The workflow is on a document library and works as follows:
The user selects the Work Task lookup from a dropdown in the edit properties form after uploading, and then saves the item (adding it to the document library). The workflow is supposed to look at the selected Work Task lookup, pull the Account and Effective Date-Type lookup IDs that the Work Task item has, and set the document's identical fields to the same values.
Here is the code for the workflow, if it helps:
If Current Item: Parent Task is not empty
If Current Item: Sub Task is not empty
Log "Both are empty" to the workflow history list
Then Set Account to Work Tasks: Account
Then Log "Set Account" to the workflow history list
Then Set Effective Date and Type to Work Tasks: Effective Date and Type
Then Log "Set EffDateType" to the workflow history list
This is all done in one step. I also added additional steps to test whether the Account and Effective Date-Type fields were set properly, and if not, to set them again. But every time I run the workflow on change, it correctly sets these fields in the first step (posted above), and the additional checks log to the history that they are not needed.
As an example, the lookup for the integer for Work Tasks: Account is set up as follows:
Data Source: Work Tasks (a list)
Field from Source: Account (a lookup)
Return Field as: Lookup ID (as Integer)
Find the List Item
Field: Title (from the Work Tasks list)
Value: Current Item: Parent Task (which is a lookup of the "Title" field from the Work Tasks list, and is set to return the value as a Lookup Value (As Text))
The Effective Date and Type setting is pretty much identical.
So, does anyone have any insight? I've tried running it as an impersonated step, setting a workflow pause (for 1 minute), and changing the lookup types in case I messed them up to start with. Ultimately the above workflow DOES work, but only when I set it to "Automatically start on the Change (edit) of an Item", NOT "Automatically start on New Item Creation" like I need it to.
Oh yes, FYI: I am using SPServices CascadingDropDown on the Work Task and Sub Task fields of the document library form, but I honestly do not believe this has anything to do with my issue.
UPDATE:
I've talked with another developer, and he believes the issue is that the workflow is occurring too quickly, before the item creates an ID for itself, which it needs to conduct the lookups. He had me add another "Pause Workflow" at the very top of my workflow code (above the If conditions) and set it for 1 minute.
It then worked properly.
The downside is that we want this labeling to occur as close to item creation as possible, because a view of the library relies on grouping based on Account and Effective Date and Type. To add to this downer, Microsoft's Pause Workflow only allows pauses of 1 minute or more, and the timer it uses is often off, resulting in a pause longer than that. So far, every test is showing a 2 minute minimum on the pause.
A possible alternative for instantaneously populating the fields is to use JavaScript and SPServices to look up the Account and Effective Date-Type fields from the task list and then populate them, but my JavaScript is not very strong and I would need help doing this (see the sketch below). If anyone has any suggestions, I would appreciate them.
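For anyone willing to help, here is roughly the shape I imagine it would take (an untested sketch; the internal list/field names "Work Tasks", "Account", and "EffDateType", and the form selector, are guesses):

// hypothetical SPServices call to pull the Account and Effective Date-Type
// values from the Work Tasks list for the selected Parent Task title
var parentTaskTitle = $("select[title='Parent Task']").val(); // guess at the form control
$().SPServices({
  operation: "GetListItems",
  listName: "Work Tasks",
  CAMLQuery: "<Query><Where><Eq><FieldRef Name='Title' />" +
             "<Value Type='Text'>" + parentTaskTitle + "</Value></Eq></Where></Query>",
  CAMLViewFields: "<ViewFields><FieldRef Name='Account' /><FieldRef Name='EffDateType' /></ViewFields>",
  async: true,
  completefunc: function (xData, Status) {
    $(xData.responseXML).SPFilterNode("z:row").each(function () {
      var account = $(this).attr("ows_Account");         // lookup format, e.g. "3;#AccountName"
      var effDateType = $(this).attr("ows_EffDateType");
      // set the document's Account and Effective Date-Type fields here
    });
  }
});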
(Answered in a question edit. Converted to a community wiki answer. See Question with no answers, but issue solved in the comments (or extended in chat).)
The OP wrote:
After further testing, I don't know that it is the ID for the item. I changed the start of the workflow to wait until a field in the item changes: I set it to wait until the ID field is not 0 (since you cannot set it to null), and it still does not work.
6/14/2012 4:13 PM Comment System Account Waiting on ID
6/14/2012 4:13 PM Comment System Account Waiting complete on ID
6/14/2012 4:13 PM Error System Account Coercion Failed: Unable to transform the input lookup data into the requested type.
I have tried other fields as well, like waiting until the Document ID value is not empty, and it will wait, log the wait finishing, and then fail.
UPDATE: This issue has something to do with the Parent Task field. I have solved it, without having to wait for a period of time, by changing the wait above to wait until the Parent Task field is not empty. The workflow then completes fine.
Anyone know why there is a delay, though? I've solved it, but still don't fully understand what takes so long.
The main fault has been solved (hence the answer), and the remaining point about the reasons for the delay would probably be a discussion point or not specific enough for SO. Any further clarification can be edited in here.
In the Logging Application Block, Logger.Write takes an event ID as one of its parameters, which is an integer. How do I decide what should be passed as the event ID?
BTW, do you really need to use the eventId? I think you can just pass the string you want to log:
Logger.Write("SomeMessage");
EDIT: I meant there should be another overload which takes just the string you want to write.
EDIT: From here:
EventId - a value you can use to further categorise Log Entries (defaults to 0 for a LogEntry and to 1 for a LogEntry implicitly created by Logger.Write)
What we do is gather the different "stories" we want to report on and then assign a range of event IDs to each of those stories. So in short, come up with a system that works for you and document it for future reference.
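For example, a sketch of one such scheme (the class name and ID ranges are purely illustrative; Logger.Write has an overload taking message, category, priority, and eventId):

// hypothetical event ID scheme: a numeric range per "story"
public static class LogEventIds
{
    public const int OrderReceived  = 1000;  // order-processing story
    public const int OrderShipped   = 1001;
    public const int PaymentStarted = 2000;  // payment story
    public const int PaymentFailed  = 2001;
}

// usage with the Enterprise Library overload (message, category, priority, eventId)
Logger.Write("Order 42 received", "Orders", 2, LogEventIds.OrderReceived);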