Wednesday, September 3, 2014

SSRS ReportServer Database Diagram

Ah! This is my first blog post! It feels great to get started!

In the recent past I happened to work with SSRS reporting, and I had to work extensively with the ReportServer database to get details about the reports. When I started, I realized there is no proper documentation for the ReportServer database. Some information is available on the internet, but it is scattered all over. So I thought of creating a small piece of documentation for the SSRS ReportServer database tables.

As a first step, I have created an SSRS ReportServer database diagram for analyzing the relationships between the tables. You can download the PDF version of the file from the link below.
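For those who want to start exploring right away, here is a minimal sketch of the kind of query I run against the ReportServer database (dbo.Catalog holds the report metadata; the Type values may vary slightly between SSRS versions):

```sql
-- List all deployed reports with their folder path and dates.
-- In the Catalog table, Type = 2 denotes a report.
SELECT Name, [Path], CreationDate, ModifiedDate
FROM dbo.Catalog
WHERE [Type] = 2
ORDER BY [Path];
```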

https://onedrive.live.com/redir?resid=EF9A407250207BDE!125&authkey=!AHefe5EwjU8GOGs&ithint=file%2cpdf

Thanks for reading my first blog post. I will continue to write in more detail about the ReportServer database objects in my forthcoming post, "SSRS - ReportServer Database Tables Explored".

Tuesday, September 2, 2014

SSAS – Aggregation Design - Usage based optimization wizard - There are no queries in the log for selected measure group


Recently I got a scenario from my friend, who was trying to migrate the query log from production to development. While trying to create the aggregation design from the Usage Based Optimization wizard, it kept showing the message "There are no queries in the log for selected measure group". She had already followed a helpful link that suggested updating the query log table with the proper server path.


But the same message was still displayed. We followed the steps below to troubleshoot the issue.

1. Start Profiler, pointing to the SQL Server instance in which the query log table is stored.
2. Select the following events in Profiler:
    • SQL:BatchStarting
    • SQL:BatchCompleted
    • SQL:StmtStarting
    • SQL:StmtCompleted

3. Start running the Usage Based Optimization wizard.
4. Profiler will show the queries issued against the database. In my environment it showed the query below:



Select Count( MSOLAP_ObjectPath ), Count( Distinct MSOLAP_User ), Count( Distinct Dataset ),
       Avg( Duration ), Min( StartTime ), Max( StartTime )
From  [OlapQueryLog]
Where ( [MSOLAP_Database] = N'AdventureWorksDW2012Multidimensional-EE' )
  And ( [MSOLAP_ObjectPath] = N'MyServer.AdventureWorksDW2012Multidimensional-EE.Adventure Works.Fact Internet Sales 1' )

With the above query we can clearly verify whether the update to the query log is correct or not. In our case the issue was with the update of the server name. One more point we noticed: the database name and the solution name should be the same, because the object path is sent with the solution name.
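As a hedged example, the kind of fix we applied looks like the following; the old server name 'ProdServer' here is hypothetical, and the object path format matches the one captured above:

```sql
-- Rewrite the server portion of MSOLAP_ObjectPath so the wizard's
-- WHERE clause (see the captured query above) finds matching rows.
UPDATE [OlapQueryLog]
SET MSOLAP_ObjectPath = REPLACE(MSOLAP_ObjectPath, 'ProdServer.', 'MyServer.')
WHERE MSOLAP_ObjectPath LIKE 'ProdServer.%';
```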

Hope this tip will be helpful while debugging this issue. 

Monday, June 30, 2014

Chennai SQL Server User Group - June 2014 Session - Overview of Data Services on Cloud OS

At the Chennai SQL Server User Group, I presented a session on "Overview of Data Services on Cloud OS". It covers the following areas in Windows Azure:

1. Handling Structured Data
2. Handling Unstructured Data
3. Gaining Insights
4. Other Services

The slide deck can be downloaded from the link below.
Presentation deck

Friday, May 30, 2014

SSIS - Split data into multiple destination Files

Recently I was working on a requirement to export data into Excel files. Since the data may run into millions of rows and Excel's row limit is 1,048,576, the data needs to be split into multiple files; for example, if the source returns 3 million rows, the data needs to be split into three destination files.

To achieve this, I used the script component configured as a destination in the data flow task.


The source query for the data source concatenates everything into a single column, "CSVCol":

SELECT 
convert(varchar(50),D1.DateKey)+ ',' + 
convert(varchar(50),D1.FullDateAlternateKey)+ ',' + 
convert(varchar(50),D1.DayNumberOfWeek)+ ',' + 
convert(varchar(50),D1.EnglishDayNameOfWeek)+ ',' + 
convert(varchar(50),D1.SpanishDayNameOfWeek)+ ',' + 
convert(varchar(50),D1.FrenchDayNameOfWeek)+ ',' + 
convert(varchar(50),D1.DayNumberOfMonth)+ ',' + 
convert(varchar(50),D1.DayNumberOfYear)+ ',' + 
convert(varchar(50),D1.WeekNumberOfYear)+ ',' + 
convert(varchar(50),D1.EnglishMonthName)+ ',' + 
convert(varchar(50),D1.SpanishMonthName)+ ',' + 
convert(varchar(50),D1.FrenchMonthName)+ ',' + 
convert(varchar(50),D1.MonthNumberOfYear)+ ',' + 
convert(varchar(50),D1.CalendarQuarter)+ ',' + 
convert(varchar(50),D1.CalendarYear)+ ',' + 
convert(varchar(50),D1.CalendarSemester)+ ',' + 
convert(varchar(50),D1.FiscalQuarter)+ ',' + 
convert(varchar(50),D1.FiscalYear)+ ',' + 
convert(varchar(50),D1.FiscalSemester) AS CSVCol
FROM DimDate D1

Note: this concatenation logic can also be implemented inside the script component itself, building the comma-separated value from the individual input columns.
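One caveat with the query above: if any of the concatenated columns can be NULL, the + operator makes the whole CSVCol value NULL. On SQL Server 2012 and later, CONCAT treats NULLs as empty strings; a shortened sketch using a few of the same DimDate columns:

```sql
-- CONCAT implicitly converts its arguments and treats NULL as ''.
SELECT CONCAT(D1.DateKey, ',', D1.FullDateAlternateKey, ',', D1.EnglishDayNameOfWeek) AS CSVCol
FROM DimDate D1;
```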


The following variables are used for the purposes below:

BucketSize - the number of records after which a new file is created
FileName - base name of the file; multiple files are created with a _fileno suffix, e.g. Sorna_1
FilePath - the path in which the files are created
HeaderRow - the header row added to each file. For this demo it is hard-coded; it can also be built dynamically using expressions or other mechanisms.
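One way to build the header row dynamically instead of hard-coding it is to generate it from the source table's metadata. A sketch assuming the DimDate source table used above:

```sql
-- Build a comma-separated header row from the column names,
-- in ordinal order, using FOR XML PATH concatenation.
SELECT STUFF((
    SELECT ',' + COLUMN_NAME
    FROM INFORMATION_SCHEMA.COLUMNS
    WHERE TABLE_NAME = 'DimDate'
    ORDER BY ORDINAL_POSITION
    FOR XML PATH('')
), 1, 1, '') AS HeaderRow;
```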


The following namespaces need to be added to the script component code:

using System;
using System.Data;
using Microsoft.SqlServer.Dts.Pipeline.Wrapper;
using Microsoft.SqlServer.Dts.Runtime.Wrapper;
using System.IO;
using Microsoft.CSharp;

Below is the code written in the script component:

public class ScriptMain : UserComponent
{
    int RowCount = 0;
    int batchcount = 0;
    int FileCount = 0;
    int BucketSize = 0;
    String Filename = "";
    String RunFileName = "";
    String Filepath = "";
    String FileFullpath = "";
    string HeaderRow = "";
    StreamWriter writer;

    public override void PreExecute()
    {
        base.PreExecute();
        RowCount = 0;
        batchcount = 0;
        FileCount = 0;
        Filepath = Variables.FilePath;
        Filename = Variables.FileName;
        BucketSize = Variables.BucketSize;
        HeaderRow = Variables.HeaderRow;
    }

    public override void PostExecute()
    {
        base.PostExecute();
        // Guard against an empty input, where no writer was ever created.
        if (writer != null)
        {
            writer.Dispose();
        }
    }

    public override void Input0_ProcessInputRow(Input0Buffer Row)
    {
        // Open the next file lazily at the start of each batch, so a
        // header-only trailing file is never created when the row count
        // is an exact multiple of the bucket size.
        if (batchcount == 0)
        {
            FileCount++;
            RunFileName = Filename + "_" + FileCount.ToString();
            FileFullpath = Filepath + RunFileName + ".csv";
            writer = new StreamWriter(FileFullpath);
            writer.WriteLine(HeaderRow);
        }
        writer.WriteLine(Row.CSVCol.ToString());
        RowCount++;
        batchcount++;
        // Close the current file once the bucket is full; the next row,
        // if any, will open a new file.
        if (batchcount == BucketSize)
        {
            writer.Dispose();
            batchcount = 0;
        }
    }
}

The package is available at the link below.

http://1drv.ms/1o44vnJ

This package creates the files as CSV; the extension can also be parameterized by putting it in a variable.

Hope this post is helpful for this scenario.

Monday, April 7, 2014

Code: 0x00000001 Exception has been thrown by the target of an invocation.

The error "Code: 0x00000001 Exception has been thrown by the target of an invocation" was thrown when I was trying to access Excel files from a script task in SSIS. The package works fine when run from the development tool, but throws this error when called from a SQL Server Agent job. It can be fixed with the following steps:

1. Go to Run and launch DcomCnfg.exe.

2. In the Component Services window, navigate to Component Services → Computers → My Computer → DCOM Config.

3. Find "Microsoft Excel Application" in the right pane. Right-click it and select Properties.

4. In the Properties dialog, go to the Identity tab. Choose the option "This user" and provide the user name and password under which Excel will be launched when called programmatically.





Thursday, November 21, 2013

SSIS - Use Dynamic SQL in Oracle Source of Oracle Attunity connector

Recently I was working with the Oracle Attunity source to pull data from an Oracle database. There was a requirement to pass dynamic SQL to the source, but the Oracle source offers only two options, Table name and SQL command; there is no option to supply the SQL command from a variable.





But we can pass dynamic SQL to the Oracle source in the following way:

1. Select the data flow task in which the Oracle source is implemented.
2. Go to Properties and click on Expressions.
3. There you can see the property [Oracle Source].[SQLCommand].
4. Set the expression for the above property to the dynamic SQL.
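As a hypothetical sketch, the expression set on [Oracle Source].[SQLCommand] might splice a package variable into the statement; the table, columns, and the User::StartDate variable here are purely illustrative:

```
"SELECT ORDER_ID, ORDER_DATE, AMOUNT FROM ORDERS WHERE ORDER_DATE >= TO_DATE('" + @[User::StartDate] + "', 'YYYY-MM-DD')"
```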




Sunday, November 17, 2013

Chennai SQL Server User Group - November 2013 Session - SQL Server 2014 New Features


In November as well, I gave a session at the Chennai SQL Server User Group on SQL Server 2014 new features.

The slide deck and the sample code can be downloaded from here:

http://sdrv.ms/1jfoYlU