Is it possible to call a web service from a reporting action in SSAS?
What I am trying to do is create a reporting action that will schedule a report to run at a certain time by calling the CreateSubscription web service method of Reporting Services.
Any advice would be much appreciated. Does anyone have any suggestions, or perhaps another angle from which I could approach this problem?
Thanks
You can't use a report action, since report actions use URL addressability. Instead, consider a regular action that invokes a web page, e.g. an ASP.NET page. The page can then call down to the Report Server.
Another approach could be a report action that calls a dummy report which invokes some custom code. The report can forward the parameter values, while the custom code invokes the API; a sketch of what that code might look like follows.
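To make that concrete, here is a minimal sketch of such custom code, assuming a Visual Studio web reference named ReportService2005 pointing at the SSRS 2005 SOAP endpoint (ReportService2005.asmx). The server URL, report path, recipient, and report parameter are placeholders; the delivery settings are those of the built-in "Report Server Email" extension.

    // Sketch: schedule a report via the SSRS 2005 SOAP API (CreateSubscription).
    using System;
    using System.Net;
    using ReportService2005; // generated web-reference proxy (assumed name)

    class ScheduleReport
    {
        static void Main()
        {
            ReportingService2005 rs = new ReportingService2005();
            rs.Url = "http://myserver/ReportServer/ReportService2005.asmx"; // placeholder
            rs.Credentials = CredentialCache.DefaultCredentials;

            // One-time schedule: run the report at 2 AM, off peak.
            string matchData =
                @"<ScheduleDefinition>
                    <StartDateTime>2006-11-01T02:00:00</StartDateTime>
                  </ScheduleDefinition>";

            // Delivery settings for the "Report Server Email" extension.
            ParameterValue[] extensionParams = new ParameterValue[2];
            extensionParams[0] = new ParameterValue();
            extensionParams[0].Name = "TO";
            extensionParams[0].Value = "analyst@example.com"; // placeholder
            extensionParams[1] = new ParameterValue();
            extensionParams[1].Name = "RenderFormat";
            extensionParams[1].Value = "CSV";

            ExtensionSettings settings = new ExtensionSettings();
            settings.Extension = "Report Server Email";
            settings.ParameterValues = extensionParams;

            // Report parameter forwarded from the cube action (placeholder).
            ParameterValue[] reportParams = new ParameterValue[1];
            reportParams[0] = new ParameterValue();
            reportParams[0].Name = "StoreID";
            reportParams[0].Value = "123";

            string subscriptionId = rs.CreateSubscription(
                "/Sales/LargeDetailReport",   // report path (placeholder)
                settings,
                "Scheduled from cube action", // description
                "TimedSubscription",          // event type
                matchData,                    // schedule definition XML
                reportParams);

            Console.WriteLine("Created subscription " + subscriptionId);
        }
    }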
|||Thank you for your reply. I wanted to schedule this as a report to run during off-peak hours because it is a large report. But after testing the subscription-based delivery method, I encountered memory problems which led me to believe that SSRS may not be the way to go.

I am trying to generate a report that returns approximately 500 MB of data. When the report is triggered, the memory usage of ReportingServicesService.exe climbs until it reaches approximately 14 GB (I have a machine with 20 GB of memory). After the memory usage reaches 12 GB, the report service stops servicing any new requests, which makes sense since MemoryLimit is set to 60%. This is where the problem begins: it seems that after the report has finished executing, the service does not release this memory, and no new requests can be serviced.
I have a few questions that maybe you could help me out with:
1. Why is the report server using so much memory? It is a simple report with one table, no special formatting, and no aggregations performed by the report server.
2. SQL Server releases its memory when other processes require it; shouldn't Reporting Services do the same thing?
3. What are the recommended values for the MemoryLimit and MaximumMemoryLimit settings? Are the defaults of 60 and 80 percent appropriate?
4. I have yet to test setting the memory limits to values greater than 100 so that the report server can use virtual memory. What are the implications of doing this? Would you recommend it?
My organization deals with billions of rows of data, so it is not uncommon for someone to request hundreds of thousands of rows for further analysis. If Reporting Services is not the way to go, then I will have to look into creating some SSIS packages or some custom .NET code to export this data.
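For illustration, a streaming export along these lines keeps memory flat because rows are written as they are read rather than buffered; the connection string, query, and output path below are placeholders, and CSV quoting/escaping is omitted for brevity.

    // Sketch of a streaming CSV export: one row in memory at a time.
    using System;
    using System.Data.SqlClient;
    using System.IO;

    class CsvExport
    {
        static void Main()
        {
            string connStr = "Data Source=myserver;Initial Catalog=Sales;Integrated Security=SSPI"; // placeholder
            string query = "SELECT * FROM dbo.FactDetail WHERE StoreID = @StoreID";                 // placeholder

            using (SqlConnection conn = new SqlConnection(connStr))
            using (SqlCommand cmd = new SqlCommand(query, conn))
            using (StreamWriter writer = new StreamWriter(@"C:\exports\detail.csv"))
            {
                cmd.Parameters.AddWithValue("@StoreID", 123);
                conn.Open();
                using (SqlDataReader reader = cmd.ExecuteReader())
                {
                    // Header row.
                    for (int i = 0; i < reader.FieldCount; i++)
                    {
                        if (i > 0) writer.Write(',');
                        writer.Write(reader.GetName(i));
                    }
                    writer.WriteLine();

                    // Data rows, written as they arrive from the server.
                    while (reader.Read())
                    {
                        for (int i = 0; i < reader.FieldCount; i++)
                        {
                            if (i > 0) writer.Write(',');
                            writer.Write(Convert.ToString(reader.GetValue(i)));
                        }
                        writer.WriteLine();
                    }
                }
            }
        }
    }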
Any advice would be much appreciated.
Thanks.
|||1. A 500 MB report will do for a Halloween trick. It will probably be useful to look at the ExecutionLog table and see how much time was spent in data retrieval, processing, and rendering (a sample query is sketched at the end of this post). To answer your question, I suspect that the high memory consumption is caused by loading the large dataset into memory. I assume you have applied the latest service packs. I am really curious how the business users would analyze hundreds of thousands of rows.
2. It should. The Windows service (ReportingServicesService.exe) is written in managed code. When the memory is not used, it should be reclaimed by the .NET garbage collector. You may want to report this to the Product Feedback Center. It could be related to the 64-bit version.
3. See http://msdn2.microsoft.com/en-us/library/ms159206.aspx.
4. Again, see the link above. For reference, the relevant RSReportServer.config section is sketched below.
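In SSRS 2005 both settings live in RSReportServer.config; this sketch shows what the documentation lists as the defaults, expressed as percentages of physical memory.

    <!-- RSReportServer.config (SSRS 2005), memory settings with default values -->
    <MemoryLimit>60</MemoryLimit>
    <MaximumMemoryLimit>80</MaximumMemoryLimit>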
As a side note, you may want to look at Analysis Services if you need to push such big datasets.
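And regarding point 1, a query along these lines against the SSRS 2005 ReportServer catalog pulls the timings and sizes per execution; the report path is a placeholder.

    -- Timings (milliseconds) and sizes for recent executions of one report.
    SELECT e.TimeStart,
           e.TimeDataRetrieval,
           e.TimeProcessing,
           e.TimeRendering,
           e.ByteCount,
           e.[RowCount],
           e.Status
    FROM   ReportServer.dbo.ExecutionLog e
           JOIN ReportServer.dbo.Catalog c ON c.ItemID = e.ReportID
    WHERE  c.Path = '/Sales/LargeDetailReport'
    ORDER BY e.TimeStart DESC;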
|||Hi Teo,
1. I've checked my ExecutionLog table. Here are the values for TimeDataRetrieval, TimeProcessing, and TimeRendering: 19.37 min, 12.33 min, and 7.38 min. The byte count was 516277912 (492 MB) and the row count was 642703. If only 492 MB are retrieved, I still don't understand why the memory consumption is so high (14 GB). This report is actually a drillthrough action that exports the data to a CSV file (an example of the generated URL follows these points). My client requires the ability, when they observe suspicious numbers for certain time periods, stores, products, etc., to drill through to the underlying records to do some further analysis and send the data back to the data suppliers so they can explain why the volumes are high or low. They may even want to sell the data to other organizations.
2. I will try to report this issue to the Product Feedback Center.
3 + 4. That article is good, but it makes no recommendation on whether we should set the memory limit values greater than 100 to use virtual memory. I will run some tests to see the effects of changing these properties.
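For completeness, the drillthrough action mentioned in point 1 generates a URL-access request of roughly this form; the server, report path, and parameter are placeholders, and rs:Format=CSV is what routes the output to the CSV renderer.

    http://myserver/ReportServer?/Sales/DetailDrillthrough&rs:Command=Render&rs:Format=CSV&StoreID=123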
What do you mean by "look at Analysis Services if you need to push such big datasets"? Are you referring to performance, or just another mechanism for exporting the data? Please elaborate. My end users require the ability to browse cubes and then drill through to large datasets.
Any advice would be much appreciated.
Thanks.
|||1. I am not excluding the possibility that this is a bug. We also ran into an issue with large reports where the reports time out. The forthcoming SP2 is expected to fix this.
2. Also, I would suggest you contact the Support Center ASAP and request a hotfix. The best thing would be if you could send them a test harness.
What I meant is that SSAS could be a better choice if the users want to aggregate large volumes of data, e.g. for historical analysis. But it sounds like in your case the end users want to see a detail report, so scratch SSAS.
|||Also, do you have the same issue if you request the report directly from the Report Server (e.g. using Report Manager) as opposed to subscription delivery? In that case the Windows service is bypassed, and if you face the same issue, the ASP.NET process memory will grow instead.
|||Yes, the same issue occurs when requesting the report directly from the report server. In this case, the memory usage of the IIS worker process (w3wp.exe) grows to approximately the same level as it did with the Report Server service. I will continue to search for a solution and post any findings.
Thanks.